Oct 11 03:55:07 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 11 03:55:07 crc restorecon[4666]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 03:55:07 crc restorecon[4666]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc 
restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc 
restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 
03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc 
restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:07 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 
03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc 
restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 03:55:08 crc restorecon[4666]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 11 03:55:09 crc kubenswrapper[4703]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.235753 4703 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245092 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245139 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245153 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245170 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245183 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245195 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245206 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245218 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245229 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245269 4703 feature_gate.go:330] 
unrecognized feature gate: Example Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245280 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245291 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245302 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245313 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245323 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245333 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245340 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245348 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245356 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245364 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245372 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245384 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245397 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245407 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245417 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245426 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245434 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245443 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245451 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245459 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245505 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245532 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245540 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245550 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245560 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245570 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245578 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245585 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245596 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245606 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245617 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245627 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245640 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245651 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245661 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245671 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245683 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245693 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245704 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245714 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245725 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245736 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245747 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245761 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245771 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245786 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245800 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245811 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245821 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245831 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245843 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245854 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245864 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245874 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245883 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245892 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245901 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245911 4703 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245919 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245927 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.245935 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247183 4703 flags.go:64] FLAG: --address="0.0.0.0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247216 4703 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247242 4703 flags.go:64] FLAG: --anonymous-auth="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247258 4703 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247275 4703 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247287 4703 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247303 4703 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247317 4703 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247328 4703 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247337 4703 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247349 4703 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247360 4703 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247370 4703 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247381 4703 flags.go:64] FLAG: --cgroup-root=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247392 4703 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247403 4703 flags.go:64] FLAG: --client-ca-file=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247414 4703 flags.go:64] FLAG: --cloud-config=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247423 4703 flags.go:64] FLAG: --cloud-provider=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247432 4703 flags.go:64] FLAG: --cluster-dns="[]"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247444 4703 flags.go:64] FLAG: --cluster-domain=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247453 4703 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247499 4703 flags.go:64] FLAG: --config-dir=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247509 4703 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247522 4703 flags.go:64] FLAG: --container-log-max-files="5"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247537 4703 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247547 4703 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247557 4703 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247567 4703 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247576 4703 flags.go:64] FLAG: --contention-profiling="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247586 4703 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247595 4703 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247605 4703 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247614 4703 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247627 4703 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247636 4703 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247645 4703 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247654 4703 flags.go:64] FLAG: --enable-load-reader="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247664 4703 flags.go:64] FLAG: --enable-server="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247672 4703 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247685 4703 flags.go:64] FLAG: --event-burst="100"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247695 4703 flags.go:64] FLAG: --event-qps="50"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247705 4703 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247715 4703 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247724 4703 flags.go:64] FLAG: --eviction-hard=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247736 4703 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247745 4703 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247754 4703 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247764 4703 flags.go:64] FLAG: --eviction-soft=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247773 4703 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247782 4703 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247793 4703 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247804 4703 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247815 4703 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247827 4703 flags.go:64] FLAG: --fail-swap-on="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247839 4703 flags.go:64] FLAG: --feature-gates=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247856 4703 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247868 4703 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247881 4703 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247892 4703 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247901 4703 flags.go:64] FLAG: --healthz-port="10248"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247911 4703 flags.go:64] FLAG: --help="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247922 4703 flags.go:64] FLAG: --hostname-override=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247933 4703 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247946 4703 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247958 4703 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247969 4703 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247981 4703 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.247992 4703 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248004 4703 flags.go:64] FLAG: --image-service-endpoint=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248014 4703 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248023 4703 flags.go:64] FLAG: --kube-api-burst="100"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248032 4703 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248044 4703 flags.go:64] FLAG: --kube-api-qps="50"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248055 4703 flags.go:64] FLAG: --kube-reserved=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248067 4703 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248078 4703 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248090 4703 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248102 4703 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248116 4703 flags.go:64] FLAG: --lock-file=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248127 4703 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248136 4703 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248146 4703 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248165 4703 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248177 4703 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248189 4703 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248200 4703 flags.go:64] FLAG: --logging-format="text"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248213 4703 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248226 4703 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248238 4703 flags.go:64] FLAG: --manifest-url=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248250 4703 flags.go:64] FLAG: --manifest-url-header=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248267 4703 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248279 4703 flags.go:64] FLAG: --max-open-files="1000000"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248295 4703 flags.go:64] FLAG: --max-pods="110"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248307 4703 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248319 4703 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248333 4703 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248345 4703 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248358 4703 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248371 4703 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248383 4703 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248415 4703 flags.go:64] FLAG: --node-status-max-images="50"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248427 4703 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248439 4703 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248452 4703 flags.go:64] FLAG: --pod-cidr=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248497 4703 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248520 4703 flags.go:64] FLAG: --pod-manifest-path=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248532 4703 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248544 4703 flags.go:64] FLAG: --pods-per-core="0"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248557 4703 flags.go:64] FLAG: --port="10250"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248568 4703 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248580 4703 flags.go:64] FLAG: --provider-id=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248592 4703 flags.go:64] FLAG: --qos-reserved=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248603 4703 flags.go:64] FLAG: --read-only-port="10255"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248615 4703 flags.go:64] FLAG: --register-node="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248629 4703 flags.go:64] FLAG: --register-schedulable="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248640 4703 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248662 4703 flags.go:64] FLAG: --registry-burst="10"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248673 4703 flags.go:64] FLAG: --registry-qps="5"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248698 4703 flags.go:64] FLAG: --reserved-cpus=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248707 4703 flags.go:64] FLAG: --reserved-memory=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248724 4703 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248734 4703 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248743 4703 flags.go:64] FLAG: --rotate-certificates="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248752 4703 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248762 4703 flags.go:64] FLAG: --runonce="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248771 4703 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248780 4703 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248790 4703 flags.go:64] FLAG: --seccomp-default="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248798 4703 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248808 4703 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248817 4703 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248827 4703 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248836 4703 flags.go:64] FLAG: --storage-driver-password="root"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248846 4703 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248855 4703 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248864 4703 flags.go:64] FLAG: --storage-driver-user="root"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248873 4703 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248883 4703 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248892 4703 flags.go:64] FLAG: --system-cgroups=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248900 4703 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248915 4703 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248924 4703 flags.go:64] FLAG: --tls-cert-file=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248933 4703 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248944 4703 flags.go:64] FLAG: --tls-min-version=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248953 4703 flags.go:64] FLAG: --tls-private-key-file=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248962 4703 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248971 4703 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248980 4703 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.248989 4703 flags.go:64] FLAG: --v="2"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.249001 4703 flags.go:64] FLAG: --version="false"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.249016 4703 flags.go:64] FLAG: --vmodule=""
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.249027 4703 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.249038 4703 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249277 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249288 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249297 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249306 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249315 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249323 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249331 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249339 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249347 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249355 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249363 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249370 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249379 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249386 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249395 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249404 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249412 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249421 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249430 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249439 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249450 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249460 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249499 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249507 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249515 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249524 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249531 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249540 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249547 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249555 4703 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249564 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249572 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249579 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249592 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249603 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249613 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249623 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249632 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249640 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249649 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249658 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249666 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249675 4703 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249683 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249691 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249699 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249707 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249714 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249722 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249730 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249737 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249745 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249753 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249760 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249768 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249775 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249783 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249791 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249798 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249806 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249820 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249828 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249835 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249843 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249851 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249859 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249867 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249878 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249887 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249896 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.249904 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.249917 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.266839 4703 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.266895 4703 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267039 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267062 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267072 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267080 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267089 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267097 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267105 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267114 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267125 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267139 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267149 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267157 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267165 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267173 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267181 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267191 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267201 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267210 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267218 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267227 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267234 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267242 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267250 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267258 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267266 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267274 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267281 4703 feature_gate.go:330] unrecognized feature gate: Example Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267290 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267298 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267306 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267314 
4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267322 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267330 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267338 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267347 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267355 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267362 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267370 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267378 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267386 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267393 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267401 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267409 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267417 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267425 4703 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267433 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267441 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267449 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267457 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267497 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267505 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267513 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267521 4703 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267529 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267537 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267545 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267553 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267560 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267568 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267576 4703 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267584 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267591 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267599 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267607 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267615 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267623 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267633 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267643 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267651 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267662 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267673 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.267687 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267935 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267950 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267958 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267967 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267978 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267990 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.267999 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268008 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268016 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268025 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268034 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268043 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268054 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268064 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268073 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268083 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268093 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268103 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268112 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268120 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268128 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268136 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268143 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268152 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268159 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268167 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268174 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268182 4703 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268190 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268198 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268206 4703 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268213 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268221 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268229 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268238 4703 feature_gate.go:330] unrecognized feature gate: Example Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268246 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268253 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268261 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268269 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268277 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268284 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268292 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268300 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268308 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268317 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 
03:55:09.268325 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268333 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268340 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268348 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268357 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268365 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268372 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268380 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268388 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268396 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268404 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268411 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268419 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268426 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268437 4703 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268447 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268455 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268488 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268498 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268506 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268515 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268523 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268531 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268539 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268547 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.268556 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.268569 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.268882 4703 server.go:940] "Client rotation is on, will bootstrap in background" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.275656 4703 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.275798 4703 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.277545 4703 server.go:997] "Starting client certificate rotation" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.277591 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.277858 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-19 11:14:20.019571907 +0000 UTC Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.278002 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 943h19m10.741581286s for next certificate rotation Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.305750 4703 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.308681 4703 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.329505 4703 log.go:25] "Validated CRI v1 runtime API" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.374896 4703 log.go:25] 
"Validated CRI v1 image API" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.378342 4703 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.386982 4703 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-11-03-51-11-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.387063 4703 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.420950 4703 manager.go:217] Machine: {Timestamp:2025-10-11 03:55:09.416619717 +0000 UTC m=+0.627101709 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:243e2b7e-609f-4e6f-ab38-53c6a8452606 BootID:6e083b1c-5c9b-4402-b8e3-8c3311b0c688 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b4:ee:f5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b4:ee:f5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:d7:ce Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ff:ec:74 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7d:52:0d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:82:fc:5d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:40:df:72:01:73 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:b5:38:8f:da:c4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 
Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 
Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.421372 4703 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.421766 4703 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.424167 4703 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.424566 4703 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.424631 4703 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.425027 4703 topology_manager.go:138] "Creating topology manager with none policy" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.425047 4703 container_manager_linux.go:303] "Creating device plugin manager" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.425603 4703 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.425669 4703 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.426010 4703 state_mem.go:36] "Initialized new in-memory state store" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.426173 4703 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.429864 4703 kubelet.go:418] "Attempting to sync node with API server" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.429912 4703 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.429952 4703 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.429975 4703 kubelet.go:324] "Adding apiserver pod source" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.429994 4703 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.434675 4703 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.436109 4703 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.436712 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.436883 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.436893 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.437145 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.438710 4703 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440361 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440491 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 
03:55:09.440517 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440531 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440552 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440565 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440578 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440600 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440617 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440631 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440666 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.440679 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.441795 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.442502 4703 server.go:1280] "Started kubelet" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.445742 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.446070 4703 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.446096 4703 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.447001 4703 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.447289 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 11 03:55:09 crc systemd[1]: Started Kubernetes Kubelet. Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.449748 4703 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.450858 4703 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.450909 4703 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.450123 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 20:10:32.572381579 +0000 UTC Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.451069 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1768h15m23.121323697s for next certificate rotation Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.451294 4703 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.451664 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.460235 4703 factory.go:55] Registering systemd factory Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.460292 4703 factory.go:221] Registration of the systemd 
container factory successfully Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.460687 4703 server.go:460] "Adding debug handlers to kubelet server" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.461526 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.463285 4703 factory.go:153] Registering CRI-O factory Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.463357 4703 factory.go:221] Registration of the crio container factory successfully Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.463756 4703 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.463846 4703 factory.go:103] Registering Raw factory Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.463875 4703 manager.go:1196] Started watching for new ooms in manager Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.464012 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.464273 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 
11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.465531 4703 manager.go:319] Starting recovery of all containers Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.464370 4703 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186d53907ef9e144 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-11 03:55:09.442433348 +0000 UTC m=+0.652915300,LastTimestamp:2025-10-11 03:55:09.442433348 +0000 UTC m=+0.652915300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.477629 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478099 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478140 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478165 
4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478189 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478215 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478311 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478338 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478401 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478431 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478456 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478605 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478630 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478659 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478681 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478706 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478812 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478877 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478906 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478931 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478957 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.478984 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" 
seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479010 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479035 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479059 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479089 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479119 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479145 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 
03:55:09.479172 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479198 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479222 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479263 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479292 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479317 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479343 4703 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479373 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479402 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479427 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479455 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479521 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479549 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479578 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479602 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479627 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479652 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479676 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479700 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479727 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479756 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479783 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479809 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479833 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479870 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479899 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479927 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479952 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.479979 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480002 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480027 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480063 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480090 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480114 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480137 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480163 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480189 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" 
seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480214 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480238 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480260 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480283 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480304 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480397 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 
03:55:09.480430 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480456 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480514 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480538 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480561 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480624 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480648 4703 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480670 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480696 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480733 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480757 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480781 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480804 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480829 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480855 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480879 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.480902 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.481809 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.481863 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.481905 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.482105 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.483626 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.483814 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.484004 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.484185 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.484228 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.486982 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487036 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487053 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487082 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487095 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487112 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487124 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487152 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487173 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487195 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487219 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487234 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487250 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487265 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487286 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487332 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487352 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487365 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487376 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487394 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487404 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487420 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487432 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487443 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487456 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487484 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487520 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487531 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487541 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487554 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487565 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487580 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487591 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487603 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487617 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487629 4703 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487642 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487652 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487663 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487677 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487687 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487701 4703 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487713 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487725 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487758 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487768 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487779 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487795 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487808 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487825 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487837 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487847 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487860 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487870 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" 
seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487883 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487892 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487903 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487918 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487932 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487946 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487957 
4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487967 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487980 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.487991 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488001 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488015 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488025 4703 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488038 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488049 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488061 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488076 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488086 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488100 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488112 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488123 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488135 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488146 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488160 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488170 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488181 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488193 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488210 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488224 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488235 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488255 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488273 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488284 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488300 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488311 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488323 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488336 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488347 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488359 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488373 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488384 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488397 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488406 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" 
seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488416 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488429 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488439 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488452 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.488495 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490675 4703 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490734 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490755 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490772 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490831 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490853 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490876 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490895 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490961 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490974 4703 reconstruct.go:97] "Volume reconstruction finished" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.490988 4703 reconciler.go:26] "Reconciler: start to sync state" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.500204 4703 manager.go:324] Recovery completed Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.511707 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.513694 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.513739 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.513756 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.515021 4703 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.515048 4703 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.515073 4703 state_mem.go:36] "Initialized new in-memory state store" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.529016 4703 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.532210 4703 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.532257 4703 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.532295 4703 kubelet.go:2335] "Starting kubelet main sync loop" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.532353 4703 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 11 03:55:09 crc kubenswrapper[4703]: W1011 03:55:09.533155 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.533250 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.538789 4703 policy_none.go:49] "None policy: Start" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.539795 4703 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.539834 4703 state_mem.go:35] 
"Initializing new in-memory state store" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.552561 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.589772 4703 manager.go:334] "Starting Device Plugin manager" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.590066 4703 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.590092 4703 server.go:79] "Starting device plugin registration server" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.590637 4703 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.590661 4703 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.590920 4703 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.591024 4703 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.591034 4703 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.600329 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.632556 4703 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 11 03:55:09 crc kubenswrapper[4703]: 
I1011 03:55:09.632705 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.634629 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.634675 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.634716 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.634955 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.635381 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.635457 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636335 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636394 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636412 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636608 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636795 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.636851 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638097 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638131 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638143 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638146 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638201 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638214 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638265 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638293 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638309 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638439 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc 
kubenswrapper[4703]: I1011 03:55:09.638482 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.638514 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.639737 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.639768 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.639779 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640227 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640264 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640277 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640434 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640566 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.640602 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641597 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641623 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641634 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641608 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641681 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641716 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641863 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.641884 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.642855 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.642890 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.642901 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.663235 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.690792 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.691898 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.691980 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.691994 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.692021 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.692535 4703 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.697424 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.698201 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.699211 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.699935 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700003 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700062 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700114 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700445 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700819 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700887 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700916 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700958 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700977 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.700995 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.701034 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802225 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802297 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802338 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802392 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802427 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802492 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802509 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802584 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802694 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802527 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802639 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802727 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802752 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802827 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802833 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802910 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802941 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802972 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803010 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803044 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803006 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803075 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.802868 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803140 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803079 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803201 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803220 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803295 4703 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803329 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.803385 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.892991 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.895414 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.895509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.895531 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.895570 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:09 crc kubenswrapper[4703]: E1011 03:55:09.896263 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection 
refused" node="crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.970351 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:09 crc kubenswrapper[4703]: I1011 03:55:09.985039 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.001200 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.025264 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.025517 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-29c2e65bdf0e990ece07c2b226d2aab8e3f6dc992de1f3435138c9b44ffcc5bc WatchSource:0}: Error finding container 29c2e65bdf0e990ece07c2b226d2aab8e3f6dc992de1f3435138c9b44ffcc5bc: Status 404 returned error can't find the container with id 29c2e65bdf0e990ece07c2b226d2aab8e3f6dc992de1f3435138c9b44ffcc5bc Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.027293 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5672d7c6c523f8811c7142f30e855cb4ce7acb435afaccee73989de6e331d3ec WatchSource:0}: Error finding container 5672d7c6c523f8811c7142f30e855cb4ce7acb435afaccee73989de6e331d3ec: Status 404 returned error can't find the container with id 5672d7c6c523f8811c7142f30e855cb4ce7acb435afaccee73989de6e331d3ec Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.028894 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0bd87d6bd8d127d38aecac588c47e1cca4c39f6935b72e76709f5acd654ff42f WatchSource:0}: Error finding container 0bd87d6bd8d127d38aecac588c47e1cca4c39f6935b72e76709f5acd654ff42f: Status 404 returned error can't find the container with id 0bd87d6bd8d127d38aecac588c47e1cca4c39f6935b72e76709f5acd654ff42f Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.030717 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.047902 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-970270ba7e3f1b400a2601b2dfa177c1f6c22d90502a3cbd568550c71e3e1e84 WatchSource:0}: Error finding container 970270ba7e3f1b400a2601b2dfa177c1f6c22d90502a3cbd568550c71e3e1e84: Status 404 returned error can't find the container with id 970270ba7e3f1b400a2601b2dfa177c1f6c22d90502a3cbd568550c71e3e1e84 Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.050580 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-221a2475502404048429b597b5631406a806aa85fec11a5599683b5a57cf8132 WatchSource:0}: Error finding container 221a2475502404048429b597b5631406a806aa85fec11a5599683b5a57cf8132: Status 404 returned error can't find the container with id 221a2475502404048429b597b5631406a806aa85fec11a5599683b5a57cf8132 Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.064079 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" 
interval="800ms" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.296690 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.298262 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.298297 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.298307 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.298332 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.298748 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.446822 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.537007 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0bd87d6bd8d127d38aecac588c47e1cca4c39f6935b72e76709f5acd654ff42f"} Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.537956 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5672d7c6c523f8811c7142f30e855cb4ce7acb435afaccee73989de6e331d3ec"} Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.538637 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"29c2e65bdf0e990ece07c2b226d2aab8e3f6dc992de1f3435138c9b44ffcc5bc"} Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.539299 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"221a2475502404048429b597b5631406a806aa85fec11a5599683b5a57cf8132"} Oct 11 03:55:10 crc kubenswrapper[4703]: I1011 03:55:10.540514 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"970270ba7e3f1b400a2601b2dfa177c1f6c22d90502a3cbd568550c71e3e1e84"} Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.815752 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.815869 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.864656 4703 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s" Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.996954 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.997079 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:10 crc kubenswrapper[4703]: W1011 03:55:10.999241 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:10 crc kubenswrapper[4703]: E1011 03:55:10.999340 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:11 crc kubenswrapper[4703]: W1011 03:55:11.032311 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:11 crc 
kubenswrapper[4703]: E1011 03:55:11.032424 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.099260 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.100956 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.101026 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.101046 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.101085 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:11 crc kubenswrapper[4703]: E1011 03:55:11.101791 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.447039 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.546130 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.546170 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.546182 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.546193 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.546272 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.547173 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.547204 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.547215 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.549084 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="70ea51930ef1d31afc48cc63142600207b356da9ba1b677db3319843b8b27dec" exitCode=0 Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.549166 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"70ea51930ef1d31afc48cc63142600207b356da9ba1b677db3319843b8b27dec"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.549295 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.550479 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.550501 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.550511 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.551565 4703 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="88b96bb921624916f6ddbb515f7f6b8078050401b7afef59166d755120228b6e" exitCode=0 Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.551647 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.551644 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"88b96bb921624916f6ddbb515f7f6b8078050401b7afef59166d755120228b6e"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.552676 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 
crc kubenswrapper[4703]: I1011 03:55:11.552721 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.552742 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.555846 4703 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b9ee1e84776076ea7dc07e029afcc880ad586fa2f195d2c88e98fe5a62210378" exitCode=0 Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.555892 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b9ee1e84776076ea7dc07e029afcc880ad586fa2f195d2c88e98fe5a62210378"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.555965 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.557274 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.557366 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.557441 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.561305 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7" exitCode=0 Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.561409 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7"} Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.561587 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.562342 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.562438 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.562545 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.565100 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.570428 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.570522 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:11 crc kubenswrapper[4703]: I1011 03:55:11.570548 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.448208 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 11 03:55:12 crc kubenswrapper[4703]: E1011 03:55:12.465672 4703 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.566772 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ffc93cc45b29da406226f49f7f4125a0f4a82720f9361669a5484b59de32af24"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.566817 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd7895b51b1bd20b818fe5bb43230a8d421c23b49cd13716c2fd6dea136173db"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.566829 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd8a9d036f439ff309de07fa7f54dfbb34c15293855a455d128270f546682d0d"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.566827 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.567552 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.567578 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.567589 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.570644 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.570683 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.570696 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.570710 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.572695 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="919394b29275fabfd1d60ee2f83dbe7fb156e3183a1a70e11ea05feefd14b1fc" exitCode=0 Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.572810 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.572789 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"919394b29275fabfd1d60ee2f83dbe7fb156e3183a1a70e11ea05feefd14b1fc"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.573764 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.573797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.573811 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.575913 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"eb91bdee4fe97e40dbee75d1e120075b04e035f7d7c024263d3f83b3a675a403"} Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.575976 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.576083 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.577803 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.577821 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.577831 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.578095 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.578141 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.578164 4703 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.702506 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.703397 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.703422 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.703432 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:12 crc kubenswrapper[4703]: I1011 03:55:12.703453 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:12 crc kubenswrapper[4703]: E1011 03:55:12.703782 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.581739 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="59c40dd323f23df2e264cf5980fcf715e10c3a61a70e79e0d64e15edb6c6eb85" exitCode=0 Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.581808 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"59c40dd323f23df2e264cf5980fcf715e10c3a61a70e79e0d64e15edb6c6eb85"} Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.582506 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.583889 4703 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.583937 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.583954 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.586767 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818"} Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.586781 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.586859 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.586939 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.586865 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588269 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588341 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588635 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588656 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.588665 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.589392 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.589442 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:13 crc kubenswrapper[4703]: I1011 03:55:13.589459 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.595964 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f06f1d874248ed7a057aefecd4656dfca8cf9738162654d21b99430a44043f01"} Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.596055 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"84db6491b79d525739a3cc5e3010e8bca3ce454993dcffa0138df5a48e10c23e"} Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.596084 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5be0dd37b5745971fa972143992d08654be9680e8d97b88cfa3943eede028d7e"} Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.596002 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.596178 4703 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.597685 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.597748 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:14 crc kubenswrapper[4703]: I1011 03:55:14.597775 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.160299 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.160569 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.162555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.162608 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.162628 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.604236 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"529e83aeb8d2fc7a998a41b538fd0259bef10591dff39eab23e32db9fcf6e33e"} Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.604650 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8cc5b4716fd48e58be834428140536aae0300d6f2c90f282693463dce18ecb9f"} Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.604407 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.606166 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.606214 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.606232 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.817206 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.817455 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.817555 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.819314 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.819350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.819374 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.904052 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.905590 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.905616 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.905626 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:15 crc kubenswrapper[4703]: I1011 03:55:15.905646 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.262633 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.358776 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.358985 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.360857 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.360910 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.360931 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.373106 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 
03:55:16.606948 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.607084 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.607083 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.608797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.608859 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.608886 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.608978 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.609014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.609028 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.609146 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.609161 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.609172 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
11 03:55:16 crc kubenswrapper[4703]: I1011 03:55:16.905115 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 11 03:55:17 crc kubenswrapper[4703]: I1011 03:55:17.610374 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:17 crc kubenswrapper[4703]: I1011 03:55:17.611861 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:17 crc kubenswrapper[4703]: I1011 03:55:17.611941 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:17 crc kubenswrapper[4703]: I1011 03:55:17.611967 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.159803 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.159960 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.160011 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.161530 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.161580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.161596 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.367669 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.496360 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.496628 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.498255 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.498309 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.498326 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.613245 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.613317 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.614713 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.614781 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:18 crc kubenswrapper[4703]: I1011 03:55:18.614804 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:19 crc kubenswrapper[4703]: E1011 03:55:19.600429 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 11 03:55:19 crc kubenswrapper[4703]: 
I1011 03:55:19.772006 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.772231 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.773667 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.773722 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.773733 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.800871 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.801034 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.802100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.802146 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:19 crc kubenswrapper[4703]: I1011 03:55:19.802161 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:21 crc kubenswrapper[4703]: I1011 03:55:21.160169 4703 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 03:55:21 crc kubenswrapper[4703]: I1011 03:55:21.160277 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 03:55:22 crc kubenswrapper[4703]: I1011 03:55:22.830602 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 11 03:55:22 crc kubenswrapper[4703]: I1011 03:55:22.830673 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 11 03:55:22 crc kubenswrapper[4703]: W1011 03:55:22.976603 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 11 03:55:22 crc kubenswrapper[4703]: I1011 03:55:22.976691 4703 trace.go:236] Trace[1741035069]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 03:55:12.975) (total time: 10000ms): Oct 11 03:55:22 crc kubenswrapper[4703]: Trace[1741035069]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms 
(03:55:22.976) Oct 11 03:55:22 crc kubenswrapper[4703]: Trace[1741035069]: [10.000966165s] [10.000966165s] END Oct 11 03:55:22 crc kubenswrapper[4703]: E1011 03:55:22.976718 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 11 03:55:23 crc kubenswrapper[4703]: W1011 03:55:23.373193 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 11 03:55:23 crc kubenswrapper[4703]: I1011 03:55:23.373326 4703 trace.go:236] Trace[103397849]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 03:55:13.371) (total time: 10001ms): Oct 11 03:55:23 crc kubenswrapper[4703]: Trace[103397849]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (03:55:23.373) Oct 11 03:55:23 crc kubenswrapper[4703]: Trace[103397849]: [10.001516481s] [10.001516481s] END Oct 11 03:55:23 crc kubenswrapper[4703]: E1011 03:55:23.373359 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 11 03:55:23 crc kubenswrapper[4703]: I1011 03:55:23.389617 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure 
output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 11 03:55:23 crc kubenswrapper[4703]: I1011 03:55:23.389731 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 11 03:55:23 crc kubenswrapper[4703]: I1011 03:55:23.398399 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 11 03:55:23 crc kubenswrapper[4703]: I1011 03:55:23.398503 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.826070 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.826362 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.828843 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.828915 4703 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.828942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:25 crc kubenswrapper[4703]: I1011 03:55:25.835005 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.630085 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.630906 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.630934 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.630943 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.933907 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.934208 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.936110 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.936165 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.936179 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:26 crc kubenswrapper[4703]: I1011 03:55:26.948740 4703 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 11 03:55:27 crc kubenswrapper[4703]: I1011 03:55:27.270516 4703 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 11 03:55:27 crc kubenswrapper[4703]: I1011 03:55:27.632917 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:27 crc kubenswrapper[4703]: I1011 03:55:27.634230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:27 crc kubenswrapper[4703]: I1011 03:55:27.634304 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:27 crc kubenswrapper[4703]: I1011 03:55:27.634319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.389877 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.393066 4703 trace.go:236] Trace[1386423104]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 03:55:13.942) (total time: 14450ms): Oct 11 03:55:28 crc kubenswrapper[4703]: Trace[1386423104]: ---"Objects listed" error: 14450ms (03:55:28.392) Oct 11 03:55:28 crc kubenswrapper[4703]: Trace[1386423104]: [14.450240596s] [14.450240596s] END Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.393112 4703 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.406945 4703 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 11 03:55:28 
crc kubenswrapper[4703]: I1011 03:55:28.407009 4703 trace.go:236] Trace[1444759721]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 03:55:14.025) (total time: 14381ms): Oct 11 03:55:28 crc kubenswrapper[4703]: Trace[1444759721]: ---"Objects listed" error: 14381ms (03:55:28.406) Oct 11 03:55:28 crc kubenswrapper[4703]: Trace[1444759721]: [14.381598801s] [14.381598801s] END Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.407079 4703 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.408268 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.442099 4703 apiserver.go:52] "Watching apiserver" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.448631 4703 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.448918 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.449558 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.449627 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.449691 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.449995 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.450039 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.450340 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.450480 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.450554 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.450631 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.452218 4703 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.453178 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.454956 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34818->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.455018 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34818->192.168.126.11:17697: read: connection reset by peer" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.455050 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.456125 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.456177 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.456492 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.456531 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.456589 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.457294 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.457906 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.458222 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.458390 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.458644 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.458814 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.468366 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.476999 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.481937 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.493672 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.506583 4703 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508053 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508144 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508212 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508250 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508294 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508333 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508375 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508426 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508525 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508564 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508599 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508638 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508689 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508731 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508781 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508827 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508865 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508924 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.508960 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509015 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509021 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509074 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509128 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509177 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509248 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509298 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509314 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509359 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509412 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509450 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509543 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509595 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509644 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509696 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509819 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509881 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509931 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509979 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510032 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510103 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510166 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510215 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510271 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510324 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510523 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510575 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510626 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510674 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510723 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510771 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511037 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511097 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511158 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511209 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511256 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509484 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509430 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511305 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511360 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511411 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511501 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511550 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511603 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511653 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511702 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511750 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511803 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511855 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511944 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511996 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512047 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512093 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512140 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512190 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512241 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512291 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512352 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512401 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 03:55:28 crc 
kubenswrapper[4703]: I1011 03:55:28.512448 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512617 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512701 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512762 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512818 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512870 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512924 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512978 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513034 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513089 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513140 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513190 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513247 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513303 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513382 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513435 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513535 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513589 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513641 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513693 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513813 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513868 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513940 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513998 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514087 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514147 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514203 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514261 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514318 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514374 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514498 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514557 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514614 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 03:55:28 
crc kubenswrapper[4703]: I1011 03:55:28.514670 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514725 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514778 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514852 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514907 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514957 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515010 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515065 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515159 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515215 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515257 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 11 
03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515301 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515343 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515380 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515418 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515454 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515524 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515566 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515600 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515635 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515672 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515708 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 03:55:28 crc 
kubenswrapper[4703]: I1011 03:55:28.515745 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515782 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515818 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515857 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515905 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515950 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515989 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516032 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516066 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516099 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516142 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.516191 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516227 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516265 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516301 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516345 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516379 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516416 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516744 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516796 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516838 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.516875 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516924 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516974 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517023 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517073 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517125 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517233 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517268 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517319 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517358 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509728 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509793 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.509981 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510004 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510108 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510237 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510276 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510533 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510561 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510715 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510847 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.510871 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511088 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511111 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511249 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511347 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511567 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.511838 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512304 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.512802 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513318 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.513665 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514185 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514204 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514233 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514658 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.514757 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515104 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515492 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515572 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515657 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.515989 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516276 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516529 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516699 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516746 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516762 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.516935 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517079 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517201 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.517398 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518103 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518156 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518196 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518231 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518269 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518310 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518350 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518386 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518422 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518515 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: 
I1011 03:55:28.518553 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518594 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518631 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518668 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518707 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518744 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518786 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518823 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518859 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518897 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518932 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518968 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519067 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519105 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519150 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519229 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519271 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519308 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519355 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.518542 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526574 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519082 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519101 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519200 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519310 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519325 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519362 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519393 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519437 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519633 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.519803 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.520248 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.520559 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.520656 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.520664 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.520879 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.521137 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:29.021112585 +0000 UTC m=+20.231594637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521103 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521222 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521240 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521261 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521636 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521688 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521747 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.521983 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.522111 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.522179 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.522323 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.522382 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.522860 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523177 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523177 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523249 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523335 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523646 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523696 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.523835 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524127 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524151 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524341 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524374 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524428 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524486 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524611 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524628 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.524852 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525067 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525085 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525112 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525225 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525238 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525340 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.525636 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526033 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526085 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526121 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526132 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526369 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526552 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526612 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.526747 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.527578 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.527821 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.527869 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.528804 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.528876 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.528986 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.529162 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.529354 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.529725 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.529798 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.529196 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530143 4703 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530161 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530205 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530234 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530286 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530290 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530401 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530513 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530522 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530886 4703 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530850 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.530922 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532241 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532275 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532432 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532515 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532866 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.532865 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.533022 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.533082 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.533109 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.533383 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.533677 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.534210 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.534302 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.534303 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.534888 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.535362 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.535381 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.535535 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.534159 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536398 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536450 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536735 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536751 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536836 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.536826 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537172 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537204 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537200 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537266 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537456 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537562 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.537942 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.538881 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.538996 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539230 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539290 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539312 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539678 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539860 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.538767 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.539917 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.537708 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:29.037681105 +0000 UTC m=+20.248163027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540054 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540069 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540255 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540263 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540556 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540669 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540723 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540673 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.540740 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540839 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.540889 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:29.04087261 +0000 UTC m=+20.251354542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.540927 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.541009 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.541092 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.541184 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.541554 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.542011 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.543811 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.544603 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.545122 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc 
kubenswrapper[4703]: I1011 03:55:28.545255 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.545341 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.545436 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.544450 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.545707 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546040 4703 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546122 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.546209 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546293 4703 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546371 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546447 4703 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546646 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546734 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546822 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.546906 4703 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.547343 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.547990 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548071 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548146 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548245 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548323 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548402 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc 
kubenswrapper[4703]: I1011 03:55:28.548504 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548596 4703 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548681 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548790 4703 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.548904 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549011 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549128 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: 
I1011 03:55:28.549265 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549616 4703 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549730 4703 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549808 4703 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.549888 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550001 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550083 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550168 4703 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550247 4703 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550320 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550405 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550516 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550602 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550696 4703 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550774 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550853 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.550927 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551013 4703 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551099 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551179 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551253 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551326 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc 
kubenswrapper[4703]: I1011 03:55:28.551407 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551508 4703 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551604 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551686 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551781 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.552117 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.552237 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.552629 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.552750 4703 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.553015 4703 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.553110 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.553193 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.553283 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.553448 4703 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.547060 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551751 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551788 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.554260 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.554452 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.554516 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.547612 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554656 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554689 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554704 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" 
(UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.554785 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:29.054761579 +0000 UTC m=+20.265243501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.551611 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554831 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554857 4703 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554874 4703 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.554897 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555363 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555381 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555395 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555414 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555426 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555442 4703 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555454 4703 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555489 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555502 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555515 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555528 4703 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555543 4703 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555555 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555566 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555581 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555593 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555604 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555615 4703 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555630 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555806 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555823 4703 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555938 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555962 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555978 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.555993 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556010 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556023 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556036 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556051 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556068 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556081 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556093 4703 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556107 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556123 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556136 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556150 4703 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556162 4703 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556177 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556190 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556203 4703 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556218 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556229 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556239 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556248 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556261 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556270 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556280 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556289 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556319 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556334 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556349 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556366 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556379 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556394 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556407 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556454 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" 
Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556481 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556493 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556503 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556516 4703 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556525 4703 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556558 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556572 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556588 4703 reconciler_common.go:293] "Volume detached for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.556600 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.559998 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560022 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560035 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560047 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560065 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560084 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560096 4703 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560107 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560120 4703 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560138 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560521 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560539 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560559 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on 
node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560849 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560864 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560880 4703 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560973 4703 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.560985 4703 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561001 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561013 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.561030 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561043 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561055 4703 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561067 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561085 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561097 4703 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561110 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561126 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561139 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561150 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561164 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561180 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.561644 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.563486 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.564223 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.564566 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.565694 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.568964 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.569898 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.571439 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.573859 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.573891 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.573906 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.573971 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:29.073948609 +0000 UTC m=+20.284430531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.574084 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.574714 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.575049 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.575575 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.575731 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.576022 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.576097 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.577353 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.577500 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.577619 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.577752 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.578389 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.578633 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.579160 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.579980 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.582073 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.595284 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.603639 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.607191 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.620891 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.621300 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.624502 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.626163 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.632257 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.632294 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.662795 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.662878 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.662830 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663110 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663208 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663222 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663237 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663250 4703 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663262 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663272 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663283 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 
03:55:28.663293 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663303 4703 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663315 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663325 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663336 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663348 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663361 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663373 4703 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663384 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663396 4703 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663406 4703 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663418 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663429 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663440 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663453 4703 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663496 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663510 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663522 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.663687 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.672831 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.674926 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818" exitCode=255 Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.675095 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818"} Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.684177 4703 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.685417 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.687269 4703 scope.go:117] "RemoveContainer" containerID="5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.688793 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.702129 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.715070 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.729729 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.741088 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.755560 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.763636 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.765373 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.773674 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.775538 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 03:55:28 crc kubenswrapper[4703]: W1011 03:55:28.781115 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-88d347c4f7845df8f2987e19aaa4c607d98c4af48a6c37f05a5eb1e5e66fa627 WatchSource:0}: Error finding container 88d347c4f7845df8f2987e19aaa4c607d98c4af48a6c37f05a5eb1e5e66fa627: Status 404 returned error can't find the container with id 88d347c4f7845df8f2987e19aaa4c607d98c4af48a6c37f05a5eb1e5e66fa627 Oct 11 03:55:28 crc kubenswrapper[4703]: I1011 03:55:28.784657 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.786420 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:28 crc kubenswrapper[4703]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Oct 11 03:55:28 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:28 crc kubenswrapper[4703]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Oct 11 03:55:28 crc kubenswrapper[4703]: source /etc/kubernetes/apiserver-url.env Oct 11 03:55:28 crc kubenswrapper[4703]: else Oct 11 03:55:28 crc kubenswrapper[4703]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Oct 11 03:55:28 crc kubenswrapper[4703]: exit 1 Oct 11 03:55:28 crc kubenswrapper[4703]: fi Oct 11 03:55:28 crc kubenswrapper[4703]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Oct 11 03:55:28 crc kubenswrapper[4703]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:28 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.787618 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Oct 11 03:55:28 crc kubenswrapper[4703]: W1011 03:55:28.795026 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-24abffae55f269dbc0bb0a4fdef695e5406e09df6b5bdf6371786f27049169d3 WatchSource:0}: Error finding container 24abffae55f269dbc0bb0a4fdef695e5406e09df6b5bdf6371786f27049169d3: Status 404 returned error can't find the container with id 24abffae55f269dbc0bb0a4fdef695e5406e09df6b5bdf6371786f27049169d3 Oct 11 03:55:28 crc kubenswrapper[4703]: W1011 03:55:28.799612 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c14092188fdd002f60b36319e47c21d0bd1333d75f3e8d01cf55a2605fa12916 WatchSource:0}: Error finding container c14092188fdd002f60b36319e47c21d0bd1333d75f3e8d01cf55a2605fa12916: Status 404 returned error can't find the container with id c14092188fdd002f60b36319e47c21d0bd1333d75f3e8d01cf55a2605fa12916 Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.800247 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:28 crc kubenswrapper[4703]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Oct 11 03:55:28 crc kubenswrapper[4703]: if [[ -f "/env/_master" ]]; then Oct 11 03:55:28 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:28 crc kubenswrapper[4703]: source "/env/_master" Oct 11 03:55:28 crc kubenswrapper[4703]: set +o allexport Oct 11 03:55:28 crc kubenswrapper[4703]: fi Oct 11 03:55:28 crc kubenswrapper[4703]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Oct 11 03:55:28 crc kubenswrapper[4703]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Oct 11 03:55:28 crc kubenswrapper[4703]: ho_enable="--enable-hybrid-overlay" Oct 11 03:55:28 crc kubenswrapper[4703]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Oct 11 03:55:28 crc kubenswrapper[4703]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Oct 11 03:55:28 crc kubenswrapper[4703]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Oct 11 03:55:28 crc kubenswrapper[4703]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Oct 11 03:55:28 crc kubenswrapper[4703]: --webhook-cert-dir="/etc/webhook-cert" \ Oct 11 03:55:28 crc kubenswrapper[4703]: --webhook-host=127.0.0.1 \ Oct 11 03:55:28 crc kubenswrapper[4703]: --webhook-port=9743 \ Oct 11 03:55:28 crc kubenswrapper[4703]: ${ho_enable} \ Oct 11 03:55:28 crc kubenswrapper[4703]: --enable-interconnect \ Oct 11 03:55:28 crc kubenswrapper[4703]: --disable-approver \ Oct 11 03:55:28 crc kubenswrapper[4703]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Oct 11 03:55:28 crc kubenswrapper[4703]: --wait-for-kubernetes-api=200s \ Oct 11 03:55:28 crc kubenswrapper[4703]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Oct 11 03:55:28 crc kubenswrapper[4703]: --loglevel="${LOGLEVEL}" Oct 11 03:55:28 crc kubenswrapper[4703]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:28 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.802192 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.803073 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:28 crc kubenswrapper[4703]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Oct 11 03:55:28 crc kubenswrapper[4703]: if [[ -f "/env/_master" ]]; then Oct 11 03:55:28 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:28 crc kubenswrapper[4703]: source "/env/_master" Oct 11 03:55:28 crc kubenswrapper[4703]: set +o allexport Oct 11 03:55:28 crc kubenswrapper[4703]: fi Oct 11 03:55:28 crc kubenswrapper[4703]: Oct 11 03:55:28 crc kubenswrapper[4703]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Oct 11 03:55:28 crc kubenswrapper[4703]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Oct 11 03:55:28 crc kubenswrapper[4703]: --disable-webhook \ Oct 11 03:55:28 crc kubenswrapper[4703]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Oct 11 03:55:28 crc kubenswrapper[4703]: --loglevel="${LOGLEVEL}" Oct 11 03:55:28 crc kubenswrapper[4703]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:28 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.803513 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Oct 11 03:55:28 crc kubenswrapper[4703]: E1011 03:55:28.804702 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.066075 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.066159 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.066184 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.066203 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066279 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066338 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:30.066321889 +0000 UTC m=+21.276803811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066396 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066413 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:30.06637801 +0000 UTC m=+21.276859942 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066444 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066652 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066670 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066610 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:30.066598996 +0000 UTC m=+21.277080918 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.066735 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:30.066724789 +0000 UTC m=+21.277206721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.167066 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.167268 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.167516 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.167527 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.167595 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:30.16757467 +0000 UTC m=+21.378056592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.537829 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.539108 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.541512 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.543093 4703 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.545081 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.545851 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.546740 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.546675 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.547658 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.548496 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.549143 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.549875 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.551702 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.553365 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.554901 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.557412 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.558879 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.561537 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.562695 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.564279 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.566871 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.568151 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.568187 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.570165 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.570944 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.573018 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.575012 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.576702 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.578352 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.579649 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.580974 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.582082 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.582295 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.583098 4703 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.583360 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.587629 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.588510 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.590428 4703 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.592956 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.593770 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.596031 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.597226 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.598266 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.598994 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.600050 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.600336 4703 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.601092 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.602084 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.602960 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.605934 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.607501 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.610575 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.611742 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.613631 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.614609 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.615784 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.617792 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.618779 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.620354 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.635778 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.647303 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.662329 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.681074 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c14092188fdd002f60b36319e47c21d0bd1333d75f3e8d01cf55a2605fa12916"} Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.682400 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"24abffae55f269dbc0bb0a4fdef695e5406e09df6b5bdf6371786f27049169d3"} Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.683583 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"88d347c4f7845df8f2987e19aaa4c607d98c4af48a6c37f05a5eb1e5e66fa627"} Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.684342 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:29 crc kubenswrapper[4703]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Oct 11 03:55:29 crc kubenswrapper[4703]: if [[ -f "/env/_master" ]]; then Oct 11 03:55:29 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:29 crc kubenswrapper[4703]: source "/env/_master" Oct 11 03:55:29 crc kubenswrapper[4703]: set +o allexport Oct 11 03:55:29 crc kubenswrapper[4703]: fi Oct 11 03:55:29 crc kubenswrapper[4703]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Oct 11 03:55:29 crc kubenswrapper[4703]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Oct 11 03:55:29 crc kubenswrapper[4703]: ho_enable="--enable-hybrid-overlay" Oct 11 03:55:29 crc kubenswrapper[4703]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Oct 11 03:55:29 crc kubenswrapper[4703]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Oct 11 03:55:29 crc kubenswrapper[4703]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Oct 11 03:55:29 crc kubenswrapper[4703]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Oct 11 03:55:29 crc kubenswrapper[4703]: --webhook-cert-dir="/etc/webhook-cert" \ Oct 11 03:55:29 crc kubenswrapper[4703]: --webhook-host=127.0.0.1 \ Oct 11 03:55:29 crc kubenswrapper[4703]: --webhook-port=9743 \ Oct 11 03:55:29 crc kubenswrapper[4703]: ${ho_enable} \ Oct 11 03:55:29 crc kubenswrapper[4703]: --enable-interconnect \ Oct 11 03:55:29 crc kubenswrapper[4703]: --disable-approver \ Oct 11 03:55:29 crc kubenswrapper[4703]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Oct 11 03:55:29 crc kubenswrapper[4703]: --wait-for-kubernetes-api=200s \ Oct 11 03:55:29 crc kubenswrapper[4703]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Oct 11 03:55:29 crc kubenswrapper[4703]: --loglevel="${LOGLEVEL}" Oct 11 03:55:29 crc kubenswrapper[4703]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:29 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.685233 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.685770 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:29 crc kubenswrapper[4703]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Oct 11 03:55:29 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:29 crc kubenswrapper[4703]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Oct 11 03:55:29 crc kubenswrapper[4703]: source /etc/kubernetes/apiserver-url.env Oct 11 03:55:29 crc kubenswrapper[4703]: else Oct 11 03:55:29 crc kubenswrapper[4703]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Oct 11 03:55:29 crc kubenswrapper[4703]: exit 1 Oct 11 03:55:29 crc kubenswrapper[4703]: fi Oct 11 03:55:29 crc kubenswrapper[4703]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Oct 11 03:55:29 crc kubenswrapper[4703]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:29 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.686560 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.687741 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.703968 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.704151 4703 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.704180 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 03:55:29 crc kubenswrapper[4703]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Oct 11 03:55:29 crc kubenswrapper[4703]: if [[ -f "/env/_master" ]]; then Oct 11 03:55:29 crc kubenswrapper[4703]: set -o allexport Oct 11 03:55:29 crc kubenswrapper[4703]: source "/env/_master" Oct 11 03:55:29 crc kubenswrapper[4703]: set +o allexport Oct 11 03:55:29 crc kubenswrapper[4703]: fi Oct 11 03:55:29 crc kubenswrapper[4703]: Oct 11 03:55:29 crc 
kubenswrapper[4703]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Oct 11 03:55:29 crc kubenswrapper[4703]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Oct 11 03:55:29 crc kubenswrapper[4703]: --disable-webhook \ Oct 11 03:55:29 crc kubenswrapper[4703]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Oct 11 03:55:29 crc kubenswrapper[4703]: --loglevel="${LOGLEVEL}" Oct 11 03:55:29 crc kubenswrapper[4703]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Oct 11 03:55:29 crc kubenswrapper[4703]: > logger="UnhandledError" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.704970 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 11 03:55:29 crc kubenswrapper[4703]: E1011 03:55:29.705641 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.711415 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b"} Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.722607 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.738143 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.748215 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.763432 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.785599 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.801706 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.814924 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.831058 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.844101 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.859522 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.869288 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.878653 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.891156 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.901379 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:29 crc kubenswrapper[4703]: I1011 03:55:29.909149 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.076818 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.076912 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:32.076866973 +0000 UTC m=+23.287348925 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.076954 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.076995 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.077032 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077118 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077142 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077183 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:32.077167761 +0000 UTC m=+23.287649723 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077192 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077216 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077234 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077241 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:32.077216773 +0000 UTC m=+23.287698705 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.077285 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:32.077268904 +0000 UTC m=+23.287750856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.178074 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.178388 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.178457 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.178518 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.178657 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:32.178623288 +0000 UTC m=+23.389105270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.532858 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.532891 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.532971 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.533149 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.533278 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:30 crc kubenswrapper[4703]: E1011 03:55:30.533399 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:30 crc kubenswrapper[4703]: I1011 03:55:30.715301 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.095713 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.095851 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.095896 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096018 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096107 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:36.095909107 +0000 UTC m=+27.306391069 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.096277 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096332 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096407 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:36.096363219 +0000 UTC m=+27.306845171 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096521 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:36.096502283 +0000 UTC m=+27.306984235 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096558 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096661 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096685 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.096814 4703 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:36.09677484 +0000 UTC m=+27.307256792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.198133 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.198404 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.198465 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.198537 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
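The `nestedpendingoperations` records above show kubelet's per-operation retry delay doubling between attempts: `durationBeforeRetry 2s` at `m=+23` for the first round of failed mount/unmount operations, then `durationBeforeRetry 4s` at `m=+27` for the retries of the same volumes. A minimal sketch of that doubling schedule is below; the initial delay and doubling factor match what these log lines show, while the cap and step count are illustrative assumptions, not values taken from this log.

```python
# Illustrative exponential-backoff schedule matching the
# "durationBeforeRetry 2s" -> "durationBeforeRetry 4s" progression
# in the kubelet log above. The cap and number of steps shown here
# are assumptions for illustration only.

def backoff_schedule(initial=2.0, factor=2.0, cap=300.0, steps=5):
    """Yield successive retry delays in seconds, doubling up to a cap."""
    delay = initial
    for _ in range(steps):
        yield delay
        delay = min(delay * factor, cap)

print(list(backoff_schedule()))
```

Under this sketch the first two delays are 2.0 s and 4.0 s, which is exactly the progression visible in the repeated `MountVolume.SetUp failed` records for `nginx-conf`, `networking-console-plugin-cert`, and the `kube-api-access-*` projected volumes.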
Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.198636 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:36.198604367 +0000 UTC m=+27.409086299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.533383 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.533657 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.533765 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:32 crc kubenswrapper[4703]: I1011 03:55:32.533850 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.534087 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:32 crc kubenswrapper[4703]: E1011 03:55:32.534319 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.532649 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.532670 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.532860 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.532994 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.533129 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.533251 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.808786 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.810162 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.810312 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.810427 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.810650 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.821630 4703 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.821935 4703 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.823277 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.823333 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.823350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.823374 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.823391 4703 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.847373 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6e083b1c-5c9b-4402-b8e3-8c3311b0c688\\\",\\\"systemUUID\\\":\\\"243e2b7e-609f-4e6f-ab38-53c6a8452606\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.853090 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.853272 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.853293 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.853319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.853338 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.870344 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6e083b1c-5c9b-4402-b8e3-8c3311b0c688\\\",\\\"systemUUID\\\":\\\"243e2b7e-609f-4e6f-ab38-53c6a8452606\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.875737 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.875818 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.875843 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.875872 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.875896 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.892428 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6e083b1c-5c9b-4402-b8e3-8c3311b0c688\\\",\\\"systemUUID\\\":\\\"243e2b7e-609f-4e6f-ab38-53c6a8452606\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.896815 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.896851 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.896863 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.896882 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.896894 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.911441 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6e083b1c-5c9b-4402-b8e3-8c3311b0c688\\\",\\\"systemUUID\\\":\\\"243e2b7e-609f-4e6f-ab38-53c6a8452606\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.915871 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.915942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.915961 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.915984 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.916001 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.933188 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6e083b1c-5c9b-4402-b8e3-8c3311b0c688\\\",\\\"systemUUID\\\":\\\"243e2b7e-609f-4e6f-ab38-53c6a8452606\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:34 crc kubenswrapper[4703]: E1011 03:55:34.933404 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.935763 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.935841 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.935860 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.935884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:34 crc kubenswrapper[4703]: I1011 03:55:34.935903 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:34Z","lastTransitionTime":"2025-10-11T03:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.038725 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.038785 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.038803 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.038826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.038843 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.141738 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.141813 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.141836 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.141874 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.141898 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.249994 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.250042 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.250062 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.250079 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.250092 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.352398 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.352452 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.352492 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.352514 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.352531 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.466100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.466148 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.466159 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.466181 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.466193 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.568226 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.568259 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.568267 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.568280 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.568288 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.670939 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.670976 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.670988 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.671004 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.671016 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.773656 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.773686 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.773694 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.773706 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.773715 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.801730 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pxxpm"] Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.802063 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.804566 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.804617 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.804905 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.819708 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.831988 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-hosts-file\") pod \"node-resolver-pxxpm\" (UID: \"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.832024 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974bn\" (UniqueName: \"kubernetes.io/projected/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-kube-api-access-974bn\") pod \"node-resolver-pxxpm\" (UID: \"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.836434 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.854471 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.865320 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.875458 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.876462 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.876520 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.876532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.876550 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.876563 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.887304 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.899268 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.909272 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.924763 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.933199 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-hosts-file\") pod \"node-resolver-pxxpm\" (UID: \"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.933256 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974bn\" (UniqueName: \"kubernetes.io/projected/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-kube-api-access-974bn\") pod \"node-resolver-pxxpm\" (UID: 
\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.933389 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-hosts-file\") pod \"node-resolver-pxxpm\" (UID: \"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.956254 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974bn\" (UniqueName: \"kubernetes.io/projected/cf5ff297-df57-4520-a699-dd9f0d3eb7f9-kube-api-access-974bn\") pod \"node-resolver-pxxpm\" (UID: \"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\") " pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.979228 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.979290 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.979303 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.979321 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:35 crc kubenswrapper[4703]: I1011 03:55:35.979333 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:35Z","lastTransitionTime":"2025-10-11T03:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.082807 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.082870 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.082886 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.082905 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.082918 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.114358 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pxxpm" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.135514 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.135638 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.135687 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.135774 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.135789 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:44.135743382 +0000 UTC m=+35.346225354 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.135848 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:44.135823284 +0000 UTC m=+35.346305296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.135890 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.135949 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.135990 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:44.135978628 +0000 UTC m=+35.346460690 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.136150 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.136193 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.136218 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.136287 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:44.136266126 +0000 UTC m=+35.346748128 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.186161 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.186505 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.186524 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.186546 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.186564 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.189172 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6b7d5"] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.189632 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-q24vt"] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.189808 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.190450 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.191837 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.192586 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.192622 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.192972 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.193036 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.193647 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.193883 4703 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.194094 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.194720 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.194759 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vgqng"] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.195157 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.195963 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.196944 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.197336 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.215528 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.244940 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-system-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245002 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-hostroot\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245030 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/74f832fc-6791-47d6-a9b3-07d923e053dc-rootfs\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 
crc kubenswrapper[4703]: I1011 03:55:36.245054 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245074 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-daemon-config\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245095 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-conf-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245118 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9j4r\" (UniqueName: \"kubernetes.io/projected/c2f31724-195d-43c1-8048-188d3646ca61-kube-api-access-b9j4r\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245139 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-socket-dir-parent\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 
03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245161 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-binary-copy\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245182 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-multus-certs\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245201 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74f832fc-6791-47d6-a9b3-07d923e053dc-proxy-tls\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245219 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-k8s-cni-cncf-io\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245238 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-netns\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " 
pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245261 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-etc-kubernetes\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245283 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74f832fc-6791-47d6-a9b3-07d923e053dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245321 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-os-release\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245350 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-bin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245369 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-os-release\") pod \"multus-vgqng\" (UID: 
\"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245452 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-cnibin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245544 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245577 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245644 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245698 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-multus\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245742 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-system-cni-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245783 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59z7g\" (UniqueName: \"kubernetes.io/projected/6b6de354-b085-4f66-ac6c-4eb6005aa965-kube-api-access-59z7g\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245830 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-cni-binary-copy\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245874 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-kubelet\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245924 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vscm\" 
(UniqueName: \"kubernetes.io/projected/74f832fc-6791-47d6-a9b3-07d923e053dc-kube-api-access-8vscm\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.245969 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-cnibin\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.246242 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.246281 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.246303 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.246388 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:44.246357943 +0000 UTC m=+35.456839905 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.250877 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.291067 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.297189 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.297230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.297241 4703 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.297258 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.297269 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.337603 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347281 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-conf-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc 
kubenswrapper[4703]: I1011 03:55:36.347315 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9j4r\" (UniqueName: \"kubernetes.io/projected/c2f31724-195d-43c1-8048-188d3646ca61-kube-api-access-b9j4r\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347332 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-socket-dir-parent\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347348 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-binary-copy\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347365 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-multus-certs\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347381 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74f832fc-6791-47d6-a9b3-07d923e053dc-proxy-tls\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc 
kubenswrapper[4703]: I1011 03:55:36.347386 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-conf-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347397 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-k8s-cni-cncf-io\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347424 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-netns\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347441 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-etc-kubernetes\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347455 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74f832fc-6791-47d6-a9b3-07d923e053dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347493 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-os-release\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347513 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-bin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347530 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-os-release\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347525 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-multus-certs\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347557 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-socket-dir-parent\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-k8s-cni-cncf-io\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347577 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-cnibin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347546 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-cnibin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347647 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-os-release\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347664 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347690 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347696 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-os-release\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347722 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-run-netns\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-multus\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347774 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-system-cni-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347789 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59z7g\" (UniqueName: \"kubernetes.io/projected/6b6de354-b085-4f66-ac6c-4eb6005aa965-kube-api-access-59z7g\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " 
pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347805 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-cni-binary-copy\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347820 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-kubelet\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347837 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vscm\" (UniqueName: \"kubernetes.io/projected/74f832fc-6791-47d6-a9b3-07d923e053dc-kube-api-access-8vscm\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347836 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-multus\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347868 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-cnibin\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc 
kubenswrapper[4703]: I1011 03:55:36.347670 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-cni-bin\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347851 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-cnibin\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347899 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-system-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-hostroot\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347942 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/74f832fc-6791-47d6-a9b3-07d923e053dc-rootfs\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347959 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.347974 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-daemon-config\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348117 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-binary-copy\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348142 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74f832fc-6791-47d6-a9b3-07d923e053dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348174 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-system-cni-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348392 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-etc-kubernetes\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348499 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-host-var-lib-kubelet\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348522 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2f31724-195d-43c1-8048-188d3646ca61-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348527 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-hostroot\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348554 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-system-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348580 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/74f832fc-6791-47d6-a9b3-07d923e053dc-rootfs\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " 
pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348590 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-daemon-config\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348673 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b6de354-b085-4f66-ac6c-4eb6005aa965-multus-cni-dir\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.348727 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b6de354-b085-4f66-ac6c-4eb6005aa965-cni-binary-copy\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.349825 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2f31724-195d-43c1-8048-188d3646ca61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.352150 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74f832fc-6791-47d6-a9b3-07d923e053dc-proxy-tls\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 
03:55:36.371261 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59z7g\" (UniqueName: \"kubernetes.io/projected/6b6de354-b085-4f66-ac6c-4eb6005aa965-kube-api-access-59z7g\") pod \"multus-vgqng\" (UID: \"6b6de354-b085-4f66-ac6c-4eb6005aa965\") " pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.371806 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vscm\" (UniqueName: \"kubernetes.io/projected/74f832fc-6791-47d6-a9b3-07d923e053dc-kube-api-access-8vscm\") pod \"machine-config-daemon-6b7d5\" (UID: \"74f832fc-6791-47d6-a9b3-07d923e053dc\") " pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.372017 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.372366 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9j4r\" (UniqueName: \"kubernetes.io/projected/c2f31724-195d-43c1-8048-188d3646ca61-kube-api-access-b9j4r\") pod \"multus-additional-cni-plugins-q24vt\" (UID: \"c2f31724-195d-43c1-8048-188d3646ca61\") " pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.391239 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.399569 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.399607 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.399617 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.399632 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.399642 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.400803 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.411937 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.423393 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.444468 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.482495 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.498608 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.501627 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.501675 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.501687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.501708 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.501724 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.508738 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.518894 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q24vt" Oct 11 03:55:36 crc kubenswrapper[4703]: W1011 03:55:36.519022 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f832fc_6791_47d6_a9b3_07d923e053dc.slice/crio-583003919daf02cdc5437016932a4882fc07db28de9584975ef2a4986dd02745 WatchSource:0}: Error finding container 583003919daf02cdc5437016932a4882fc07db28de9584975ef2a4986dd02745: Status 404 returned error can't find the container with id 583003919daf02cdc5437016932a4882fc07db28de9584975ef2a4986dd02745 Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.519057 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.527188 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.529272 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vgqng" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.532581 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.532673 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.532918 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.533164 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.532931 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:36 crc kubenswrapper[4703]: E1011 03:55:36.533439 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.539766 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.553563 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: W1011 03:55:36.561247 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6de354_b085_4f66_ac6c_4eb6005aa965.slice/crio-c304aa7b98ea0f1b4429f9e0509ba99162f4b22c2e9fb0f76bc5a857f9a4c2bb WatchSource:0}: Error finding container c304aa7b98ea0f1b4429f9e0509ba99162f4b22c2e9fb0f76bc5a857f9a4c2bb: Status 404 returned error can't find the container with id c304aa7b98ea0f1b4429f9e0509ba99162f4b22c2e9fb0f76bc5a857f9a4c2bb Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.570132 4703 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.582714 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4jc5"] Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.583540 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.588138 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.588352 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.588622 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.588822 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.588405 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.589019 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.596630 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.600363 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.606366 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.606404 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.606416 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.606433 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.606444 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.624571 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.639299 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651188 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651221 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651252 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651267 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651282 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert\") pod \"ovnkube-node-m4jc5\" (UID: 
\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651322 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651383 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651400 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651422 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651436 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651450 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651482 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651495 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651510 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651531 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651547 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651561 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.651575 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgh7c\" (UniqueName: \"kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.654053 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.670779 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.700971 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715283 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715369 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715416 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715425 4703 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715439 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.715448 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.726211 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.735219 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.738052 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pxxpm" event={"ID":"cf5ff297-df57-4520-a699-dd9f0d3eb7f9","Type":"ContainerStarted","Data":"70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.738120 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pxxpm" event={"ID":"cf5ff297-df57-4520-a699-dd9f0d3eb7f9","Type":"ContainerStarted","Data":"9dd6f79801ef3a07a93fb050e492a4cef62299cc65e29e16276f3d5b87a5c9a1"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.742282 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerStarted","Data":"415fabbd606084ec5156362c2af4319876cefb17513bed760c8eeb5099c5b7bb"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.745165 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgqng" event={"ID":"6b6de354-b085-4f66-ac6c-4eb6005aa965","Type":"ContainerStarted","Data":"d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.745194 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgqng" event={"ID":"6b6de354-b085-4f66-ac6c-4eb6005aa965","Type":"ContainerStarted","Data":"c304aa7b98ea0f1b4429f9e0509ba99162f4b22c2e9fb0f76bc5a857f9a4c2bb"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.745529 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.748394 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.748443 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" 
event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"583003919daf02cdc5437016932a4882fc07db28de9584975ef2a4986dd02745"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.751956 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.751991 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752008 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752027 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752052 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log\") pod 
\"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752070 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752069 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752102 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752118 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752142 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: 
\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752137 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752173 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752190 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752192 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752171 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752207 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752227 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752225 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752253 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752257 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752230 
4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752122 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752300 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752320 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752370 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752583 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752656 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgh7c\" (UniqueName: \"kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752703 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752724 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752747 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752832 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752863 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752905 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752908 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config\") pod \"ovnkube-node-m4jc5\" (UID: 
\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.752928 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.753384 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.756700 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.758816 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.768828 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgh7c\" (UniqueName: \"kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c\") pod \"ovnkube-node-m4jc5\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: 
I1011 03:55:36.769887 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.781205 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.789728 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.800444 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.811383 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.817786 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.817829 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.817841 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.817861 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.817877 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.822465 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.839454 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.853402 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/p
rivate\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.873978 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.887311 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.898903 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.908321 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.916561 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.920056 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.920101 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 
03:55:36.920138 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.920161 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.920173 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:36Z","lastTransitionTime":"2025-10-11T03:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.922337 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.925498 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: W1011 03:55:36.935810 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8090d9aa_59c5_4c77_a4c0_94f2fa8d4426.slice/crio-377cc690dc6da19f1109e489b1b681e74bce4e48325c37bd421da64d645ae93b WatchSource:0}: Error finding container 377cc690dc6da19f1109e489b1b681e74bce4e48325c37bd421da64d645ae93b: Status 404 returned error can't find the container with id 377cc690dc6da19f1109e489b1b681e74bce4e48325c37bd421da64d645ae93b Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.936159 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.949707 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:36 crc kubenswrapper[4703]: I1011 03:55:36.988566 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.009534 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.023783 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.023857 4703 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.023870 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.023892 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.023904 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.040264 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.057770 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.126365 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.126397 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.126407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.126422 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.126433 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.229213 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.229248 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.229259 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.229274 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.229294 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.332185 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.332245 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.332263 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.332288 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.332306 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.434443 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.434526 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.434540 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.434566 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.434578 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.537347 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.537419 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.537443 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.537506 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.537532 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.640279 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.641190 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.641221 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.641241 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.641255 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.702119 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-59fkr"] Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.702666 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.705802 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.706349 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.708865 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.709137 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.714662 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.726222 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c4
54ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.745623 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.745704 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.746290 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.746311 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.746334 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.746391 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.753067 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f" exitCode=0 Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.753193 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.754883 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" exitCode=0 Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.754955 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.754982 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"377cc690dc6da19f1109e489b1b681e74bce4e48325c37bd421da64d645ae93b"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.758122 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.762084 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.766018 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee03a3c-bf16-41a7-b901-a5523ccfd389-serviceca\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.766158 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqf2q\" (UniqueName: \"kubernetes.io/projected/8ee03a3c-bf16-41a7-b901-a5523ccfd389-kube-api-access-nqf2q\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.766213 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee03a3c-bf16-41a7-b901-a5523ccfd389-host\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.780127 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11
T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.794610 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.807336 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.821690 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.836665 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.848708 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.849982 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.850024 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.850036 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.850055 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.850070 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.859869 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.867546 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee03a3c-bf16-41a7-b901-a5523ccfd389-serviceca\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.867672 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqf2q\" (UniqueName: 
\"kubernetes.io/projected/8ee03a3c-bf16-41a7-b901-a5523ccfd389-kube-api-access-nqf2q\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.867704 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee03a3c-bf16-41a7-b901-a5523ccfd389-host\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.868030 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee03a3c-bf16-41a7-b901-a5523ccfd389-host\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.869625 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee03a3c-bf16-41a7-b901-a5523ccfd389-serviceca\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.874097 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.882657 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.889976 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqf2q\" (UniqueName: \"kubernetes.io/projected/8ee03a3c-bf16-41a7-b901-a5523ccfd389-kube-api-access-nqf2q\") pod \"node-ca-59fkr\" (UID: \"8ee03a3c-bf16-41a7-b901-a5523ccfd389\") " pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.904141 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.914859 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.921313 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.935705 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.948215 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd0
2150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.952653 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.952689 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.952701 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.952720 4703 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.952733 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:37Z","lastTransitionTime":"2025-10-11T03:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.970537 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.984645 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:37 crc kubenswrapper[4703]: I1011 03:55:37.997324 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.005576 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.014409 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.021842 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-59fkr" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.023070 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.032067 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: W1011 03:55:38.037220 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee03a3c_bf16_41a7_b901_a5523ccfd389.slice/crio-3384abc3eb3552b3430294548b594df0f167704a1e2916cd377328e1fa19ed73 WatchSource:0}: Error finding container 3384abc3eb3552b3430294548b594df0f167704a1e2916cd377328e1fa19ed73: Status 404 returned error can't find the container with id 3384abc3eb3552b3430294548b594df0f167704a1e2916cd377328e1fa19ed73 Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.043127 4703 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055641 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055682 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055691 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055706 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055746 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.055715 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.070401 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.158277 
4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.158301 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.158309 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.158321 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.158330 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.260514 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.260556 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.260572 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.260594 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.260611 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.363983 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.364051 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.364068 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.364091 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.364108 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.467840 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.467913 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.467939 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.467969 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.467992 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.533203 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.533352 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:38 crc kubenswrapper[4703]: E1011 03:55:38.533621 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.533666 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:38 crc kubenswrapper[4703]: E1011 03:55:38.533814 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:38 crc kubenswrapper[4703]: E1011 03:55:38.533945 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.570091 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.570148 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.570165 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.570186 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.570204 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.673131 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.673195 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.673214 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.673239 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.673256 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.763982 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerStarted","Data":"648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.766902 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-59fkr" event={"ID":"8ee03a3c-bf16-41a7-b901-a5523ccfd389","Type":"ContainerStarted","Data":"8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.766991 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-59fkr" event={"ID":"8ee03a3c-bf16-41a7-b901-a5523ccfd389","Type":"ContainerStarted","Data":"3384abc3eb3552b3430294548b594df0f167704a1e2916cd377328e1fa19ed73"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.770795 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.770847 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.776288 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.776333 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 
crc kubenswrapper[4703]: I1011 03:55:38.776345 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.776364 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.776376 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.780747 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.792020 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.803920 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.812332 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.821912 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.831073 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.844582 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.858602 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.872415 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.878295 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.878328 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.878338 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.878356 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.878367 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.885645 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.893164 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.908746 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.920985 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd0
2150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.935635 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.944757 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.954518 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.965107 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.974140 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.983113 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.983222 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.983288 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.983370 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.983423 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:38Z","lastTransitionTime":"2025-10-11T03:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.984924 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:
55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:38 crc kubenswrapper[4703]: I1011 03:55:38.994121 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.004373 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.018909 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.030647 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.051776 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.076401 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.092498 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.092559 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.092570 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.092608 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.092622 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.116654 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.149213 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.169878 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.195575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.195625 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.195638 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.195654 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.195665 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.298509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.299129 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.299142 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.299159 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.299173 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.401804 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.401877 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.401899 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.401923 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.401940 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.505402 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.505479 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.505490 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.505511 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.505524 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.544202 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.554981 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.569455 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.592252 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.607196 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.607260 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.607282 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.607301 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.607314 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.610187 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.622120 
4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.632961 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.647839 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.659764 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.683997 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.698854 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.709952 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.709982 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc 
kubenswrapper[4703]: I1011 03:55:39.709990 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.710003 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.710014 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.711862 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.724288 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.732745 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.776456 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6" exitCode=0 Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.776589 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.781947 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.782016 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.782047 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.782077 4703 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.796743 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814049 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814134 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814159 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814140 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814191 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.814334 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.826102 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.844012 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"m
ountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.858862 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.870175 4703 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.881727 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.898951 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.911750 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.918083 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.918145 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.918162 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.918188 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.918208 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:39Z","lastTransitionTime":"2025-10-11T03:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.942042 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:39 crc kubenswrapper[4703]: I1011 03:55:39.969185 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.011213 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.022779 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.022833 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.022850 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.022874 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.022895 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.054814 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.091335 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.126098 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.126136 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.126145 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.126160 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.126172 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.228153 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.228207 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.228222 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.228243 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.228260 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.331095 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.331183 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.331208 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.331240 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.331262 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.434019 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.434075 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.434093 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.434117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.434134 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.533555 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.533587 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.533647 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:40 crc kubenswrapper[4703]: E1011 03:55:40.533754 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:40 crc kubenswrapper[4703]: E1011 03:55:40.533888 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:40 crc kubenswrapper[4703]: E1011 03:55:40.534004 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.536527 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.536573 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.536590 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.536614 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.536631 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.639454 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.639566 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.639584 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.639609 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.639626 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.742800 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.742866 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.742885 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.742911 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.742931 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.790764 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b" exitCode=0 Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.790819 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.813779 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.828101 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.844392 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c4
54ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.847260 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.847304 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.847324 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.847347 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.847415 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.861193 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.879300 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.896004 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.908382 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.918139 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.929508 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.938250 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.950569 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.951180 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.951222 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.951232 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.951248 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.951258 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:40Z","lastTransitionTime":"2025-10-11T03:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.963688 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:40 crc kubenswrapper[4703]: I1011 03:55:40.975406 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.003030 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.055459 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.055540 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.055556 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.055578 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.055591 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.159021 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.159077 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.159094 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.159118 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.159138 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.262618 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.262693 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.262706 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.262730 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.262746 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.368137 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.368194 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.368212 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.368239 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.368261 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.470799 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.471142 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.471155 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.471175 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.471192 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.574334 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.574374 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.574387 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.574408 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.574420 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.677885 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.677947 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.677964 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.677990 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.678008 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.782655 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.782763 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.782789 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.782863 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.782885 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.801439 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06" exitCode=0 Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.801533 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.809429 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.826129 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.846632 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.877502 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.886443 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.886527 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.886542 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.886563 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.886577 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.892402 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.904015 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.918742 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c4
54ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.930960 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.944491 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a5
4960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.957419 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.973892 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.985704 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.989539 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.989593 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.989606 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.989624 4703 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.989636 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:41Z","lastTransitionTime":"2025-10-11T03:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:41 crc kubenswrapper[4703]: I1011 03:55:41.998458 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.007295 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.020738 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.092179 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.092250 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.092260 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.092282 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.092309 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.201437 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.201578 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.201607 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.201639 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.201663 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.305058 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.305121 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.305142 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.305168 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.305186 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.407950 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.408019 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.408037 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.408063 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.408081 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.511504 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.511580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.511598 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.511626 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.511648 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.533575 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.533958 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:42 crc kubenswrapper[4703]: E1011 03:55:42.534390 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:42 crc kubenswrapper[4703]: E1011 03:55:42.534562 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.535680 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:42 crc kubenswrapper[4703]: E1011 03:55:42.535904 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.615056 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.615123 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.615142 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.615169 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.615189 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.717913 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.717943 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.717952 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.717966 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.717976 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.819152 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1" exitCode=0 Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.819207 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.820044 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.820075 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.820091 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.820114 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.820131 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.822727 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b22698b58047065f355a20c87e6a70f91d921ac2f8a4f651274eaa153e7f689d"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.838029 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}
\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.851340 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.867172 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.882077 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.896913 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.909104 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.918090 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.924780 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.924812 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.924820 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.924834 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.924847 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:42Z","lastTransitionTime":"2025-10-11T03:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.932538 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.942943 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.964809 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.973878 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.984326 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:42 crc kubenswrapper[4703]: I1011 03:55:42.994823 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.003114 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.027020 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.027057 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.027067 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.027081 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.027090 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.131863 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.131901 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.131911 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.131926 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.131934 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.233936 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.234303 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.234423 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.234556 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.234679 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.337532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.337568 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.337580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.337595 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.337604 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.439658 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.440106 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.440281 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.440426 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.440611 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.542973 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.543035 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.543044 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.543055 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.543064 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.645703 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.646027 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.646045 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.646065 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.646080 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.749100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.749140 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.749156 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.749179 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.749198 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.834636 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerStarted","Data":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.834731 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.834904 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.841586 4703 generic.go:334] "Generic (PLEG): container finished" podID="c2f31724-195d-43c1-8048-188d3646ca61" containerID="28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5" exitCode=0 Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.841701 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerDied","Data":"28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.851170 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7312548a8aa43507e9c61cef8dc71d23e0cbc965b92c1c4c1fa578b294612f07"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.851972 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.852019 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 
03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.852035 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.852057 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.852074 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.853170 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"710af3ad23aecd87b74b285a5ac1a0263bb9ddb18f7f9032ecb512625b24c7bb"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.856383 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.870373 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.881657 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.884611 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.892567 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.909887 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.926508 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.946886 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.955806 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.955850 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.955860 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.955875 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.955884 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:43Z","lastTransitionTime":"2025-10-11T03:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.963185 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:43 crc kubenswrapper[4703]: I1011 03:55:43.983166 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:43.996694 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8d
ad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:43Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.020326 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.030025 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.044787 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.090419 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.090488 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.090501 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.090520 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.090530 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.091568 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.108673 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.125352 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.137027 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.140332 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.140459 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140569 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.140540652 +0000 UTC m=+51.351022574 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140583 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.140640 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.140721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140845 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140863 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140873 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140906 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.140898811 +0000 UTC m=+51.351380733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140940 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140958 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.140952972 +0000 UTC m=+51.351434894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.140971 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.140965173 +0000 UTC m=+51.351447095 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.151589 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.161927 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.180554 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192420 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192451 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192461 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192490 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192501 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.192531 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.206809 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7312548a8aa43507e9c61cef8dc71d23e0cbc965b92c1c4c1fa578b294612f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22698b58047065f355a20c87e6a70f91d921ac2f8a4f651274eaa153e7f689d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.219550 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.230629 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgqng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b6de354-b085-4f66-ac6c-4eb6005aa965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-59z7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgqng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.240378 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710af3ad23aecd87b74b285a5ac1a0263bb9ddb18f7f9032ecb512625b24c7bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.254557 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59fkr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee03a3c-bf16-41a7-b901-a5523ccfd389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf9e1af0fe5df7fabd9969f50806c97dc1b058a60cc9158c9ca9e9afe157073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\
"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqf2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59fkr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.281964 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"899c5710-ccac-41a7-a51e-f13015a9b925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d36cda809e465ae3cdbebd04ca1d0c880d982920b9e17e237e1c02da638b4b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3ac63ddb093f123aeccd29dda14b9d42e3196818fe255e4315b2fd81e4d4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://337d11e16f135a7996228291aa5aeb9dca3b0743c8d53ad358ad75a3f6ac604f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d0a644c02180fdd84bdf90488b6f2b54986bc290dc6ba0063e1ecb405800b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bdb20db3996ab197ac02919be77d320bf5f7ef288276b9f0788963b19a3d818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 03:55:23.042098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 03:55:23.055510 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1583286593/tls.crt::/tmp/serving-cert-1583286593/tls.key\\\\\\\"\\\\nI1011 03:55:28.420513 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 03:55:28.424215 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 03:55:28.424249 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 03:55:28.424289 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 03:55:28.424299 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 03:55:28.439370 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 03:55:28.439543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439608 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 03:55:28.439657 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 03:55:28.439755 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 03:55:28.439805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 03:55:28.439852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 03:55:28.439400 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 03:55:28.443344 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6665fdb421f1acf4f7a285ccb1adcdca9e7b6191f21ed8dd1ebfa21771a0cbc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbcccfdee4565b57a78cd594f8e68b9ee6f8afb6d329818c54058ef262ae1aa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.294449 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.294498 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.294508 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.294524 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.294535 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.295847 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74f832fc-6791-47d6-a9b3-07d923e053dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f96945fd0f7582822795213d47459caf01faf1e4d2a2edb5112939558c438969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vscm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6b7d5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.320331 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgh7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4jc5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.343249 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.343427 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.343447 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.343462 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.343537 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.343519328 +0000 UTC m=+51.554001250 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.397174 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.397246 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.397265 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.397290 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.397309 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.500082 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.500134 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.500151 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.500176 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.500195 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.532866 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.532866 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.532986 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.533100 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.533273 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:44 crc kubenswrapper[4703]: E1011 03:55:44.533460 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.602539 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.602575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.602585 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.602602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.602614 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.705282 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.705605 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.705619 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.705636 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.705647 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.807875 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.807915 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.807923 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.807939 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.807949 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.860294 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" event={"ID":"c2f31724-195d-43c1-8048-188d3646ca61","Type":"ContainerStarted","Data":"cb449ed7c41f1b6a753e4d9f5fb9f919b523b4a2af1e466521a5cf6a4874361e"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.860335 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.874847 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.886998 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.900320 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pxxpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5ff297-df57-4520-a699-dd9f0d3eb7f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b7c324a45c1721cf18751603441ea7e9c3f31d9286985ee1b27a838cdce1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-974bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pxxpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.910543 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.910593 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.910608 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.910627 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.910640 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:44Z","lastTransitionTime":"2025-10-11T03:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.915663 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7905d36e-5e82-411f-bbff-19fbde136cb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5b679c77d5f5c431108f51cdeecf410bd4f6f76c74c2580cd3c39898eec23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8a0a051a04
bd848843a5b40337140b947c17d8378b0efbcef7d507ecadcd40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6798d26a9f6823db8508369a2228601bbce6ce6f60799acc543e3a9074321130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c454ab3f6ac3f4ddf483a0039e0ddcff628cc078dd736875b86becc2f51c0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.932687 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7312548a8aa43507e9c61cef8dc71d23e0cbc965b92c1c4c1fa578b294612f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22698b58047065f355a20c87e6a70f91d921ac2f8a4f651274eaa153e7f689d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:44 crc kubenswrapper[4703]: I1011 03:55:44.952282 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q24vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2f31724-195d-43c1-8048-188d3646ca61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T03:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb449ed7c41f1b6a753e4d9f5fb9f919b523b4a2af1e466521a5cf6a4874361e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T03:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62485da5c000bd8c6573a0f058d3fd2a015e6770f02b7f5421b06ca4397fbd1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://648c6689ca31c9d3d3468ca21c3942c0ec6951e3dca1aeff9b3b53d2b44d28d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8146030a1513c74faccf964a1d3cabf1f94a54960dca98e08ccdc2f6af693f7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154d3
303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154d3303c5331d6e83a4e49d822a847e82d52e36f66ca374cfbae71ca5bafb06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4872a255883fda0c061cf1a430ef4d6de2bd3465540433734a6c76038f0126f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d55d83b423ac444f6edbee0712cc90b10c9b714407b03f5d9a799be53a7cb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T03:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T03:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b9j4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T03:55:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q24vt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T03:55:44Z is after 2025-08-24T17:21:41Z" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:44.999858 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vgqng" podStartSLOduration=9.999826705 podStartE2EDuration="9.999826705s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:44.98084332 +0000 UTC m=+36.191325292" watchObservedRunningTime="2025-10-11 03:55:44.999826705 +0000 UTC m=+36.210308667" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.013900 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.013962 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.013987 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.014019 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.014043 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:45Z","lastTransitionTime":"2025-10-11T03:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.085592 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.085653 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.085671 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.085695 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.085713 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:45Z","lastTransitionTime":"2025-10-11T03:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.104962 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-59fkr" podStartSLOduration=10.104936499 podStartE2EDuration="10.104936499s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.065449749 +0000 UTC m=+36.275931711" watchObservedRunningTime="2025-10-11 03:55:45.104936499 +0000 UTC m=+36.315418441" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.126694 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.126729 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.126739 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.126756 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.126768 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T03:55:45Z","lastTransitionTime":"2025-10-11T03:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.149183 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podStartSLOduration=10.149164584 podStartE2EDuration="10.149164584s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.106058908 +0000 UTC m=+36.316540840" watchObservedRunningTime="2025-10-11 03:55:45.149164584 +0000 UTC m=+36.359646506" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.176573 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.176548393 podStartE2EDuration="17.176548393s" podCreationTimestamp="2025-10-11 03:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.149874874 +0000 UTC m=+36.360356826" watchObservedRunningTime="2025-10-11 03:55:45.176548393 +0000 UTC m=+36.387030355" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.177016 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podStartSLOduration=10.177008755 podStartE2EDuration="10.177008755s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.176748298 +0000 UTC m=+36.387230220" watchObservedRunningTime="2025-10-11 03:55:45.177008755 +0000 UTC m=+36.387490717" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.285137 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw"] Oct 11 03:55:45 crc 
kubenswrapper[4703]: I1011 03:55:45.285535 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.287032 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.287522 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.308494 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=17.308456049 podStartE2EDuration="17.308456049s" podCreationTimestamp="2025-10-11 03:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.308275225 +0000 UTC m=+36.518757147" watchObservedRunningTime="2025-10-11 03:55:45.308456049 +0000 UTC m=+36.518937971" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.313443 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj"] Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.313836 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.314140 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4s5kf"] Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.315845 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.316023 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.316020 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.316102 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.316284 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: E1011 03:55:45.316359 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s5kf" podUID="8c8bf83f-ab19-43ea-b331-db24e4e97709" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.349564 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pxxpm" podStartSLOduration=10.34951508 podStartE2EDuration="10.34951508s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.348429192 +0000 UTC m=+36.558911144" watchObservedRunningTime="2025-10-11 03:55:45.34951508 +0000 UTC m=+36.559997042" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.354812 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9qj6\" (UniqueName: \"kubernetes.io/projected/8b22979d-3055-4cbb-987d-abc24656bd97-kube-api-access-m9qj6\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.354892 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.354947 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" 
Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.354988 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b22979d-3055-4cbb-987d-abc24656bd97-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.397845 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q24vt" podStartSLOduration=10.397814425 podStartE2EDuration="10.397814425s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:45.397735303 +0000 UTC m=+36.608217235" watchObservedRunningTime="2025-10-11 03:55:45.397814425 +0000 UTC m=+36.608296377" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455650 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63299867-4fab-4ae5-9e3d-86c5eeab5053-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455728 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63299867-4fab-4ae5-9e3d-86c5eeab5053-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 
03:55:45.455755 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hx9q\" (UniqueName: \"kubernetes.io/projected/8c8bf83f-ab19-43ea-b331-db24e4e97709-kube-api-access-2hx9q\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455786 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9qj6\" (UniqueName: \"kubernetes.io/projected/8b22979d-3055-4cbb-987d-abc24656bd97-kube-api-access-m9qj6\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455816 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455842 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455875 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: 
\"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455896 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b22979d-3055-4cbb-987d-abc24656bd97-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455924 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63299867-4fab-4ae5-9e3d-86c5eeab5053-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455949 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.455971 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.456709 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.457272 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b22979d-3055-4cbb-987d-abc24656bd97-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.468711 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b22979d-3055-4cbb-987d-abc24656bd97-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.472085 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9qj6\" (UniqueName: \"kubernetes.io/projected/8b22979d-3055-4cbb-987d-abc24656bd97-kube-api-access-m9qj6\") pod \"ovnkube-control-plane-749d76644c-6khdw\" (UID: \"8b22979d-3055-4cbb-987d-abc24656bd97\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556416 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " 
pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556454 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63299867-4fab-4ae5-9e3d-86c5eeab5053-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556485 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556506 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556554 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63299867-4fab-4ae5-9e3d-86c5eeab5053-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556586 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/63299867-4fab-4ae5-9e3d-86c5eeab5053-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556601 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hx9q\" (UniqueName: \"kubernetes.io/projected/8c8bf83f-ab19-43ea-b331-db24e4e97709-kube-api-access-2hx9q\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556855 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.556883 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/63299867-4fab-4ae5-9e3d-86c5eeab5053-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.557578 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/63299867-4fab-4ae5-9e3d-86c5eeab5053-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: E1011 
03:55:45.557806 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:45 crc kubenswrapper[4703]: E1011 03:55:45.557850 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs podName:8c8bf83f-ab19-43ea-b331-db24e4e97709 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:46.057836499 +0000 UTC m=+37.268318431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs") pod "network-metrics-daemon-4s5kf" (UID: "8c8bf83f-ab19-43ea-b331-db24e4e97709") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.570014 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63299867-4fab-4ae5-9e3d-86c5eeab5053-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.577959 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hx9q\" (UniqueName: \"kubernetes.io/projected/8c8bf83f-ab19-43ea-b331-db24e4e97709-kube-api-access-2hx9q\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.586431 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63299867-4fab-4ae5-9e3d-86c5eeab5053-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jksjj\" (UID: \"63299867-4fab-4ae5-9e3d-86c5eeab5053\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.605862 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.626677 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" Oct 11 03:55:45 crc kubenswrapper[4703]: W1011 03:55:45.642362 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63299867_4fab_4ae5_9e3d_86c5eeab5053.slice/crio-6ad13826869bbf4826605f0ce83fe4543efc7aec1783d9895563586624433c58 WatchSource:0}: Error finding container 6ad13826869bbf4826605f0ce83fe4543efc7aec1783d9895563586624433c58: Status 404 returned error can't find the container with id 6ad13826869bbf4826605f0ce83fe4543efc7aec1783d9895563586624433c58 Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.862857 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" event={"ID":"8b22979d-3055-4cbb-987d-abc24656bd97","Type":"ContainerStarted","Data":"4946e6b51e0fa080cf13c972d5e9531cc67de9dfd30f6c29e387c8d5e0712bff"} Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.863822 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:45 crc kubenswrapper[4703]: I1011 03:55:45.864711 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" event={"ID":"63299867-4fab-4ae5-9e3d-86c5eeab5053","Type":"ContainerStarted","Data":"6ad13826869bbf4826605f0ce83fe4543efc7aec1783d9895563586624433c58"} Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.063196 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.063372 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.063446 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs podName:8c8bf83f-ab19-43ea-b331-db24e4e97709 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:47.063422939 +0000 UTC m=+38.273904861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs") pod "network-metrics-daemon-4s5kf" (UID: "8c8bf83f-ab19-43ea-b331-db24e4e97709") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.267200 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.401296 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4s5kf"] Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.401615 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.401690 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s5kf" podUID="8c8bf83f-ab19-43ea-b331-db24e4e97709" Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.533529 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.533593 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.533547 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.533667 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.533758 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:46 crc kubenswrapper[4703]: E1011 03:55:46.533834 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.868792 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" event={"ID":"63299867-4fab-4ae5-9e3d-86c5eeab5053","Type":"ContainerStarted","Data":"63a2452637d13a76f7ad5a77f2c6c7815b34b9c10a17717a5ba400c354a909be"} Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.871211 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" event={"ID":"8b22979d-3055-4cbb-987d-abc24656bd97","Type":"ContainerStarted","Data":"1c258da785096c5b56721f495f1b522c9cc5f64def1282b6f05391b9f4b3541c"} Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.871274 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" event={"ID":"8b22979d-3055-4cbb-987d-abc24656bd97","Type":"ContainerStarted","Data":"1875544453f8057a267caac2162ea83501fe1e293a73ebd462d49f7a2c5ba17d"} Oct 11 03:55:46 crc kubenswrapper[4703]: I1011 03:55:46.888904 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jksjj" podStartSLOduration=11.888881193 podStartE2EDuration="11.888881193s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:46.888553935 +0000 UTC m=+38.099035887" watchObservedRunningTime="2025-10-11 03:55:46.888881193 +0000 UTC m=+38.099363115" Oct 11 03:55:47 crc kubenswrapper[4703]: I1011 03:55:47.075983 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:47 crc kubenswrapper[4703]: E1011 03:55:47.076257 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:47 crc kubenswrapper[4703]: E1011 03:55:47.076393 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs podName:8c8bf83f-ab19-43ea-b331-db24e4e97709 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:49.076358757 +0000 UTC m=+40.286840719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs") pod "network-metrics-daemon-4s5kf" (UID: "8c8bf83f-ab19-43ea-b331-db24e4e97709") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:47 crc kubenswrapper[4703]: I1011 03:55:47.533495 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:47 crc kubenswrapper[4703]: E1011 03:55:47.533721 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s5kf" podUID="8c8bf83f-ab19-43ea-b331-db24e4e97709" Oct 11 03:55:47 crc kubenswrapper[4703]: I1011 03:55:47.877291 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3769a4b33a1121dd87a3d7f89ca8a881e3f27c06d17a85020a2de65473383287"} Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.532633 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.532666 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.532661 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:48 crc kubenswrapper[4703]: E1011 03:55:48.532831 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 03:55:48 crc kubenswrapper[4703]: E1011 03:55:48.532953 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 03:55:48 crc kubenswrapper[4703]: E1011 03:55:48.533151 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.897371 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6khdw" podStartSLOduration=12.897344776 podStartE2EDuration="12.897344776s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:46.909856621 +0000 UTC m=+38.120338553" watchObservedRunningTime="2025-10-11 03:55:48.897344776 +0000 UTC m=+40.107826738" Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.962831 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 11 03:55:48 crc kubenswrapper[4703]: I1011 03:55:48.963093 4703 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 11 03:55:49 crc 
kubenswrapper[4703]: I1011 03:55:49.024644 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dnf85"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.025247 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.028228 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.029032 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fn9dd"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.029483 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.030408 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.030927 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.030944 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.031164 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.031259 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.035847 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.036735 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.037284 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.037515 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.037686 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.037879 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.038013 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.038094 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 
crc kubenswrapper[4703]: I1011 03:55:49.038385 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.038735 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.038805 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.039397 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.051063 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.053120 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.053170 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.053432 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vmhjk"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.053900 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.054246 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.055086 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-m5qlx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.055738 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.056405 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.056916 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.057686 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pb92d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.058755 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8rmz"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.058951 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.059540 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-842hf"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.059693 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.060191 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.060665 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.061176 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.068432 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.068775 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.069247 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.069664 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.070153 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.074661 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.075370 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.075426 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: 
I1011 03:55:49.075837 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076139 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076221 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076310 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076564 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076304 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.076786 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.077247 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.077532 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.077246 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.077672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.077725 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.078088 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.078554 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.079827 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.080304 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.080522 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.080793 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.101793 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.102957 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.103272 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.104330 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.105718 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.105780 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.112791 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.112597 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113069 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113367 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113516 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.106854 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113675 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.105839 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-config\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113743 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113787 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-images\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113814 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113868 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf479\" (UniqueName: \"kubernetes.io/projected/f5fe968e-3846-4f87-a7b9-4fdb9d945eab-kube-api-access-wf479\") pod \"downloads-7954f5f757-dnf85\" (UID: \"f5fe968e-3846-4f87-a7b9-4fdb9d945eab\") " pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113956 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.113972 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114006 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc 
kubenswrapper[4703]: I1011 03:55:49.114032 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/990fd4dc-4606-469a-9ced-8f434c2df124-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114052 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:49 crc kubenswrapper[4703]: E1011 03:55:49.114204 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:49 crc kubenswrapper[4703]: E1011 03:55:49.114296 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs podName:8c8bf83f-ab19-43ea-b331-db24e4e97709 nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.114269623 +0000 UTC m=+44.324751585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs") pod "network-metrics-daemon-4s5kf" (UID: "8c8bf83f-ab19-43ea-b331-db24e4e97709") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114368 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114516 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114744 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114066 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmvsf\" (UniqueName: \"kubernetes.io/projected/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-kube-api-access-gmvsf\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114841 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjzf\" (UniqueName: \"kubernetes.io/projected/990fd4dc-4606-469a-9ced-8f434c2df124-kube-api-access-9mjzf\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114894 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.114903 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.115579 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.115907 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.116090 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.116158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.116965 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.117549 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.117744 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.122071 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123105 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123255 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123297 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123418 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123655 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123873 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.123981 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124095 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124135 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 11 03:55:49 crc 
kubenswrapper[4703]: I1011 03:55:49.124183 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124228 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124259 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124320 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124393 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124486 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124563 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124758 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124912 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.125011 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.124328 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.125239 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.128661 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.156972 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rj5d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.164356 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.164573 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.164790 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.164914 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165073 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165220 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165333 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 
11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165477 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165736 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.165903 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.166111 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.166945 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.167039 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.167521 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.167583 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.167944 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.178619 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.180450 4703 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.181050 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.181597 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.182053 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.182074 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.182218 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.183511 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.183931 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.184150 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.184907 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dfgkf"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.185295 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.186233 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.186678 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.190517 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.190724 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.193555 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.193750 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.193977 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.194114 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.194851 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.195967 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.196223 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.196675 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.197092 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hrr65"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.197822 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.198132 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.198757 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.198783 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.198986 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.199108 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.199141 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.199657 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.201246 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dnf85"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.201295 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8wq6t"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.201914 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.202186 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.202751 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.204098 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.205534 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mlb5q"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.206263 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.207584 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.209602 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.217214 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219371 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219417 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/990fd4dc-4606-469a-9ced-8f434c2df124-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219449 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmvsf\" (UniqueName: \"kubernetes.io/projected/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-kube-api-access-gmvsf\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 
03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219493 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjzf\" (UniqueName: \"kubernetes.io/projected/990fd4dc-4606-469a-9ced-8f434c2df124-kube-api-access-9mjzf\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219532 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-config\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219552 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-images\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.219574 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.220442 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: 
\"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.221331 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.221492 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-images\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.221701 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990fd4dc-4606-469a-9ced-8f434c2df124-config\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.229055 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf479\" (UniqueName: \"kubernetes.io/projected/f5fe968e-3846-4f87-a7b9-4fdb9d945eab-kube-api-access-wf479\") pod \"downloads-7954f5f757-dnf85\" (UID: \"f5fe968e-3846-4f87-a7b9-4fdb9d945eab\") " pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.229163 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk657\" (UniqueName: \"kubernetes.io/projected/99ddef85-eb82-4975-aaad-83f550825f30-kube-api-access-mk657\") pod \"migrator-59844c95c7-nt5pr\" (UID: \"99ddef85-eb82-4975-aaad-83f550825f30\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" Oct 11 03:55:49 crc 
kubenswrapper[4703]: I1011 03:55:49.229413 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.230873 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.230946 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/990fd4dc-4606-469a-9ced-8f434c2df124-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.231127 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.231405 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.233406 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.235254 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6588z"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.236906 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.240448 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.241217 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.246086 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fn9dd"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.246501 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-psx9s"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.247024 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.247279 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2jlb9"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.248076 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.248626 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.249041 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vmhjk"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.250071 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.250456 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pb92d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.251294 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-69n87"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.251843 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.252944 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8rmz"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.254647 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.256300 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m5qlx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.260173 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.262781 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.264657 4703 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.265631 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.266698 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.266905 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.267884 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.268845 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.269840 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.271053 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.272440 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.272932 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rf7jf"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.273897 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.274151 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-728m8"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.275117 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.275183 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.275804 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.276770 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.277913 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.278803 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8wq6t"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.279734 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rj5d"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.280655 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dfgkf"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.281817 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rf7jf"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.283029 4703 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.284047 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.285278 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hrr65"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.286510 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2jlb9"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.287458 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.288044 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.288657 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.289609 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-728m8"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.291015 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.291692 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.301193 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.303605 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-psx9s"] Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.315441 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.327644 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.330564 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk657\" (UniqueName: \"kubernetes.io/projected/99ddef85-eb82-4975-aaad-83f550825f30-kube-api-access-mk657\") pod \"migrator-59844c95c7-nt5pr\" (UID: \"99ddef85-eb82-4975-aaad-83f550825f30\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.347887 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.367789 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.387443 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.407872 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.428297 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.447351 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.468676 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.487423 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.509539 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.533608 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.535876 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.547322 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.567994 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.587949 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.607909 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.668077 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.687273 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.710766 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.731177 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.747754 4703 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.767491 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.787392 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.807590 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.827124 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.847660 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.867811 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.888958 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.909295 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.929014 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.948226 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 
11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.968112 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 11 03:55:49 crc kubenswrapper[4703]: I1011 03:55:49.988562 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.008525 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.028858 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.048031 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.068027 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.088029 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.107831 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.127282 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.148391 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.169341 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.188055 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.210545 4703 request.go:700] Waited for 1.011164471s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.214968 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.228323 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.248398 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.268977 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.288000 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.307758 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.328617 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" 
Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.348104 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.368139 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.387817 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.407786 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.428850 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.448946 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.468406 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.488357 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.508652 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.528076 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.533499 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.533538 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.533825 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.547834 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.598437 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmvsf\" (UniqueName: \"kubernetes.io/projected/01868ae6-6122-48a8-bc3a-3cb62cf08ac2-kube-api-access-gmvsf\") pod \"openshift-apiserver-operator-796bbdcf4f-q458x\" (UID: \"01868ae6-6122-48a8-bc3a-3cb62cf08ac2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.608666 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.614606 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.622857 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjzf\" (UniqueName: \"kubernetes.io/projected/990fd4dc-4606-469a-9ced-8f434c2df124-kube-api-access-9mjzf\") pod \"machine-api-operator-5694c8668f-fn9dd\" (UID: \"990fd4dc-4606-469a-9ced-8f434c2df124\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.637043 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.647846 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.668909 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.688327 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.709786 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.748525 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.755738 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf479\" (UniqueName: \"kubernetes.io/projected/f5fe968e-3846-4f87-a7b9-4fdb9d945eab-kube-api-access-wf479\") pod \"downloads-7954f5f757-dnf85\" (UID: \"f5fe968e-3846-4f87-a7b9-4fdb9d945eab\") " 
pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.767722 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.787425 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.808944 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.827679 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.849112 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x"] Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.850126 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.853779 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:50 crc kubenswrapper[4703]: W1011 03:55:50.856913 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01868ae6_6122_48a8_bc3a_3cb62cf08ac2.slice/crio-532f440125fa6eb5c7819b6d7a2572f68ad39fd0f0fe29d71a9bc685089f0a0d WatchSource:0}: Error finding container 532f440125fa6eb5c7819b6d7a2572f68ad39fd0f0fe29d71a9bc685089f0a0d: Status 404 returned error can't find the container with id 532f440125fa6eb5c7819b6d7a2572f68ad39fd0f0fe29d71a9bc685089f0a0d Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.868058 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.879106 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.888396 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.894240 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" event={"ID":"01868ae6-6122-48a8-bc3a-3cb62cf08ac2","Type":"ContainerStarted","Data":"532f440125fa6eb5c7819b6d7a2572f68ad39fd0f0fe29d71a9bc685089f0a0d"} Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.909226 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.928045 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.948975 4703 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.968265 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 11 03:55:50 crc kubenswrapper[4703]: I1011 03:55:50.988484 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.007303 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.028390 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.048149 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.067311 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.087306 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.097620 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fn9dd"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.108151 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.127148 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.137953 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-dnf85"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.147588 4703 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.182246 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk657\" (UniqueName: \"kubernetes.io/projected/99ddef85-eb82-4975-aaad-83f550825f30-kube-api-access-mk657\") pod \"migrator-59844c95c7-nt5pr\" (UID: \"99ddef85-eb82-4975-aaad-83f550825f30\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.187960 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.207957 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.227360 4703 request.go:700] Waited for 1.593392789s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/pods/machine-approver-56656f9798-842hf/status Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5878306c-90e4-4924-b23c-90079c35d2dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254414 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254434 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-serving-cert\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254450 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64v9p\" (UniqueName: \"kubernetes.io/projected/cffc1277-9912-47dd-80a3-d724149e420c-kube-api-access-64v9p\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254491 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254507 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc 
kubenswrapper[4703]: I1011 03:55:51.254521 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95nl\" (UniqueName: \"kubernetes.io/projected/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-kube-api-access-c95nl\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254557 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254597 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254819 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254864 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a9831579-a012-4c7d-a48c-2c0eee55cc36-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254900 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.254949 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-srv-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255015 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255100 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7580d2bf-4b66-4dfb-9693-ebcc64225c58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255197 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88q5n\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-kube-api-access-88q5n\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255402 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255607 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255645 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255707 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-serving-cert\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255743 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255813 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255850 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-config\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " 
pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255886 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-machine-approver-tls\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255921 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255955 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.255988 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be037531-a087-4b3e-9327-f28d38fd3961-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256019 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cs8d\" (UniqueName: \"kubernetes.io/projected/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-kube-api-access-7cs8d\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256091 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-client\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256123 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256157 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwp8\" (UniqueName: \"kubernetes.io/projected/be037531-a087-4b3e-9327-f28d38fd3961-kube-api-access-8fwp8\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256189 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-oauth-config\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " 
pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256218 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256252 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-config\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256285 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cffc1277-9912-47dd-80a3-d724149e420c-audit-dir\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256341 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79f5m\" (UniqueName: 
\"kubernetes.io/projected/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-kube-api-access-79f5m\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256411 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-trusted-ca-bundle\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256520 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-node-pullsecrets\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256558 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62aed52a-c801-4ec0-b96b-6132de0e4200-serving-cert\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.256590 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-serving-cert\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc 
kubenswrapper[4703]: I1011 03:55:51.257020 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ffhd\" (UniqueName: \"kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-encryption-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257097 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.257117 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:51.757094168 +0000 UTC m=+42.967576130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257162 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257204 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqrh\" (UniqueName: \"kubernetes.io/projected/9b135157-3c40-4f85-84a4-5c521ccbf1bd-kube-api-access-vnqrh\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257273 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qr8\" (UniqueName: 
\"kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257328 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktv9\" (UniqueName: \"kubernetes.io/projected/62aed52a-c801-4ec0-b96b-6132de0e4200-kube-api-access-pktv9\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257363 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9831579-a012-4c7d-a48c-2c0eee55cc36-config\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257394 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-audit-policies\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257500 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbzq\" (UniqueName: \"kubernetes.io/projected/e25f5a36-9626-41eb-82b0-6f33dfaf3001-kube-api-access-sdbzq\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.257550 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258459 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258600 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tg67\" (UniqueName: \"kubernetes.io/projected/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-kube-api-access-7tg67\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258642 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258675 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258708 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98w9g\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258743 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258819 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258855 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit-dir\") pod \"apiserver-76f77b778f-pb92d\" 
(UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258886 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5878306c-90e4-4924-b23c-90079c35d2dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258918 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.258972 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259005 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.259053 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259086 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-encryption-config\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259120 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259155 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259191 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259226 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7b9c\" (UniqueName: \"kubernetes.io/projected/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-kube-api-access-g7b9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259293 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259325 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259355 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9831579-a012-4c7d-a48c-2c0eee55cc36-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259388 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259771 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259840 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259887 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e25f5a36-9626-41eb-82b0-6f33dfaf3001-metrics-tls\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259914 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.259990 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62c5\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-kube-api-access-f62c5\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260020 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-oauth-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260042 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-etcd-client\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260118 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260217 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-trusted-ca\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260305 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-config\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260355 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7580d2bf-4b66-4dfb-9693-ebcc64225c58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260422 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7580d2bf-4b66-4dfb-9693-ebcc64225c58-config\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260525 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgf4w\" (UniqueName: \"kubernetes.io/projected/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-kube-api-access-vgf4w\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260574 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260625 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260673 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6bd\" (UniqueName: \"kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.260723 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-serving-cert\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261054 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-service-ca\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261110 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261167 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-image-import-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261204 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.261239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-auth-proxy-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261271 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.261306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.266993 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.287444 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.307150 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.328175 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.361829 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.361936 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:51.861916964 +0000 UTC m=+43.072398886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.362133 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-encryption-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.362156 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363135 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363159 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363182 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qr8\" (UniqueName: \"kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363208 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktv9\" (UniqueName: \"kubernetes.io/projected/62aed52a-c801-4ec0-b96b-6132de0e4200-kube-api-access-pktv9\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363224 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363242 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b7cae24-659a-4bda-b925-d74698801dd9-config-volume\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363259 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c0331e0-7b42-4227-8d50-d4c66a927dea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363279 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9831579-a012-4c7d-a48c-2c0eee55cc36-config\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363293 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-audit-policies\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363554 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363737 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd47f\" (UniqueName: \"kubernetes.io/projected/886a9919-144b-4a33-81ac-90b79512aa6a-kube-api-access-wd47f\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363796 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbzq\" (UniqueName: \"kubernetes.io/projected/e25f5a36-9626-41eb-82b0-6f33dfaf3001-kube-api-access-sdbzq\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363813 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tg67\" (UniqueName: \"kubernetes.io/projected/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-kube-api-access-7tg67\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363825 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.363832 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzsb\" (UniqueName: \"kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364323 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364408 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx4ks\" (UniqueName: \"kubernetes.io/projected/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-kube-api-access-nx4ks\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364448 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-audit-policies\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 
11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364594 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9831579-a012-4c7d-a48c-2c0eee55cc36-config\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364575 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364689 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364733 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit-dir\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364775 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: 
\"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364813 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364850 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qm7z\" (UniqueName: \"kubernetes.io/projected/fd6eb215-d833-4726-b13c-54c53ca1e7e5-kube-api-access-4qm7z\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364889 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364932 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364967 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-serving-cert\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.364990 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit-dir\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365000 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365082 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365114 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc 
kubenswrapper[4703]: I1011 03:55:51.365161 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365186 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9831579-a012-4c7d-a48c-2c0eee55cc36-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365234 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.366125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.366167 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: 
\"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.366883 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.365261 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.366987 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-mountpoint-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367016 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62c5\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-kube-api-access-f62c5\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367052 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367072 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367090 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-config\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367111 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7580d2bf-4b66-4dfb-9693-ebcc64225c58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367129 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367147 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367165 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3516a340-0474-422a-bc3c-4424e08c17b0-service-ca-bundle\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367186 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-service-ca\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367204 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367223 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367240 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-serving-cert\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367257 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/291efcc3-da30-4a56-855c-8ee24e840b26-signing-cabundle\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367288 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367306 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c95nl\" (UniqueName: \"kubernetes.io/projected/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-kube-api-access-c95nl\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367321 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367336 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-service-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367353 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367370 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9831579-a012-4c7d-a48c-2c0eee55cc36-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367386 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-webhook-cert\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367401 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-client\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367416 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-registration-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367432 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367448 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7580d2bf-4b66-4dfb-9693-ebcc64225c58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367480 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q86hk\" (UniqueName: \"kubernetes.io/projected/5b7cae24-659a-4bda-b925-d74698801dd9-kube-api-access-q86hk\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367490 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367840 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.367503 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4n5\" (UniqueName: \"kubernetes.io/projected/2c0331e0-7b42-4227-8d50-d4c66a927dea-kube-api-access-kn4n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368085 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88q5n\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-kube-api-access-88q5n\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" 
Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368115 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368138 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368163 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-srv-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368185 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw2c\" (UniqueName: \"kubernetes.io/projected/beb46851-cb80-4329-85b4-c2c87ebeb836-kube-api-access-ssw2c\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368288 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-config\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " 
pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368311 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cs8d\" (UniqueName: \"kubernetes.io/projected/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-kube-api-access-7cs8d\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368342 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fa6f46d-74a5-4e8b-b112-55ed455b5409-config\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368366 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368385 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwp8\" (UniqueName: \"kubernetes.io/projected/be037531-a087-4b3e-9327-f28d38fd3961-kube-api-access-8fwp8\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368407 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-oauth-config\") 
pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368429 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bhl\" (UniqueName: \"kubernetes.io/projected/8fa6f46d-74a5-4e8b-b112-55ed455b5409-kube-api-access-b8bhl\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368454 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cffc1277-9912-47dd-80a3-d724149e420c-audit-dir\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368498 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-certs\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368524 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79f5m\" (UniqueName: \"kubernetes.io/projected/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-kube-api-access-79f5m\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368544 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-trusted-ca-bundle\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368565 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/beb46851-cb80-4329-85b4-c2c87ebeb836-tmpfs\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368594 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-node-pullsecrets\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368632 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-images\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368656 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/0b4045ea-f5af-4576-9ada-baddba2cc319-kube-api-access-twlw5\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368795 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-encryption-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368852 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368884 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-socket-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368914 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqrh\" (UniqueName: \"kubernetes.io/projected/9b135157-3c40-4f85-84a4-5c521ccbf1bd-kube-api-access-vnqrh\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368941 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 
03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368968 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d64762-444d-41a0-bf17-d8de6712a90f-proxy-tls\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368993 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/291efcc3-da30-4a56-855c-8ee24e840b26-signing-key\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369021 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95f6c15c-5422-4ef6-a4a8-108959afcae6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369046 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd6eb215-d833-4726-b13c-54c53ca1e7e5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369071 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nb2qg\" (UniqueName: \"kubernetes.io/projected/34d64762-444d-41a0-bf17-d8de6712a90f-kube-api-access-nb2qg\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369092 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369117 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369141 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369165 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmqc\" (UniqueName: \"kubernetes.io/projected/59fc390f-24d6-4f16-855b-475f50c6f4b0-kube-api-access-ncmqc\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " 
pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369187 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-csi-data-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369212 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369235 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98w9g\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369258 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5878306c-90e4-4924-b23c-90079c35d2dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369279 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data\") 
pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369303 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-plugins-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369305 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369326 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0331e0-7b42-4227-8d50-d4c66a927dea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369354 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.369377 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7lt\" (UniqueName: \"kubernetes.io/projected/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-kube-api-access-sh7lt\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369520 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369542 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369553 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-encryption-config\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369592 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: 
\"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369631 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7b9c\" (UniqueName: \"kubernetes.io/projected/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-kube-api-access-g7b9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369669 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-config\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369696 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6ms5\" (UniqueName: \"kubernetes.io/projected/95f6c15c-5422-4ef6-a4a8-108959afcae6-kube-api-access-l6ms5\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369724 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znkwt\" (UniqueName: \"kubernetes.io/projected/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-kube-api-access-znkwt\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369758 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369783 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369809 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e25f5a36-9626-41eb-82b0-6f33dfaf3001-metrics-tls\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369833 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa6f46d-74a5-4e8b-b112-55ed455b5409-serving-cert\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369857 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34d64762-444d-41a0-bf17-d8de6712a90f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4v67m\" 
(UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369886 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-oauth-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369911 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-etcd-client\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369939 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-trusted-ca\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369964 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7580d2bf-4b66-4dfb-9693-ebcc64225c58-config\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.369988 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgf4w\" (UniqueName: 
\"kubernetes.io/projected/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-kube-api-access-vgf4w\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370011 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6bd\" (UniqueName: \"kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370034 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cst4c\" (UniqueName: \"kubernetes.io/projected/291efcc3-da30-4a56-855c-8ee24e840b26-kube-api-access-cst4c\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370057 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-node-bootstrap-token\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370081 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-serving-cert\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: 
I1011 03:55:51.370104 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370130 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-image-import-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370151 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-auth-proxy-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370174 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370203 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: 
\"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370226 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55tmj\" (UniqueName: \"kubernetes.io/projected/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-kube-api-access-55tmj\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370251 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrkv\" (UniqueName: \"kubernetes.io/projected/3516a340-0474-422a-bc3c-4424e08c17b0-kube-api-access-wxrkv\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370272 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-cert\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370297 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5878306c-90e4-4924-b23c-90079c35d2dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370322 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64v9p\" (UniqueName: 
\"kubernetes.io/projected/cffc1277-9912-47dd-80a3-d724149e420c-kube-api-access-64v9p\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370347 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370375 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370397 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370422 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 
03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-srv-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370491 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370516 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370540 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b7cae24-659a-4bda-b925-d74698801dd9-metrics-tls\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370560 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-apiservice-cert\") pod \"packageserver-d55dfcdfc-vmsw5\" 
(UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370590 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370615 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-default-certificate\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370647 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-serving-cert\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370678 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370702 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtg94\" (UniqueName: 
\"kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370727 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6bj\" (UniqueName: \"kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370750 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370773 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-machine-approver-tls\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370800 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.370880 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370905 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be037531-a087-4b3e-9327-f28d38fd3961-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370929 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-client\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370954 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370977 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-stats-auth\") pod \"router-default-5444994796-mlb5q\" (UID: 
\"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371130 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-config\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371221 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371257 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371280 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886a9919-144b-4a33-81ac-90b79512aa6a-proxy-tls\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371305 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-service-ca\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371379 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62aed52a-c801-4ec0-b96b-6132de0e4200-serving-cert\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371411 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-serving-cert\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371447 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-metrics-certs\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371502 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ffhd\" (UniqueName: \"kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371705 
4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7580d2bf-4b66-4dfb-9693-ebcc64225c58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.371851 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.368882 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.372421 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-serving-cert\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.373085 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9831579-a012-4c7d-a48c-2c0eee55cc36-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.373129 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.375381 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62aed52a-c801-4ec0-b96b-6132de0e4200-config\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.375440 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.375710 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.375801 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.376018 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.376238 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.376662 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.376938 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.377341 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.377801 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.377866 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.377936 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.378030 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.378415 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-config\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.379151 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-machine-approver-tls\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.379425 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-config\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380114 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-audit\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380436 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5878306c-90e4-4924-b23c-90079c35d2dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380444 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380731 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380753 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.380872 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-encryption-config\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.381237 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.381392 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-config\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.381539 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-srv-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.370443 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.381641 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.382189 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/62aed52a-c801-4ec0-b96b-6132de0e4200-serving-cert\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.382693 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-oauth-serving-cert\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.382749 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383032 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-image-import-ca\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383179 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383287 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9b135157-3c40-4f85-84a4-5c521ccbf1bd-node-pullsecrets\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383563 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-auth-proxy-config\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383596 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cffc1277-9912-47dd-80a3-d724149e420c-audit-dir\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383756 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-etcd-client\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.383792 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:51.883776035 +0000 UTC m=+43.094258037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.383876 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.384511 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cffc1277-9912-47dd-80a3-d724149e420c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.384666 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b135157-3c40-4f85-84a4-5c521ccbf1bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.384740 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-serving-cert\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: 
\"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.384911 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-serving-cert\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.385388 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-trusted-ca\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.385867 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7580d2bf-4b66-4dfb-9693-ebcc64225c58-config\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.386238 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b135157-3c40-4f85-84a4-5c521ccbf1bd-serving-cert\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.387358 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.388039 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cffc1277-9912-47dd-80a3-d724149e420c-etcd-client\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.388103 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-trusted-ca-bundle\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.388576 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e25f5a36-9626-41eb-82b0-6f33dfaf3001-metrics-tls\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.389115 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.389116 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.391797 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.391953 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-console-oauth-config\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.392033 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5878306c-90e4-4924-b23c-90079c35d2dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.392554 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.392829 
4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.394285 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/be037531-a087-4b3e-9327-f28d38fd3961-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.411411 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qr8\" (UniqueName: \"kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8\") pod \"controller-manager-879f6c89f-9lb9d\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.414859 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.421240 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktv9\" (UniqueName: \"kubernetes.io/projected/62aed52a-c801-4ec0-b96b-6132de0e4200-kube-api-access-pktv9\") pod \"authentication-operator-69f744f599-q8rmz\" (UID: \"62aed52a-c801-4ec0-b96b-6132de0e4200\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.451432 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbzq\" (UniqueName: \"kubernetes.io/projected/e25f5a36-9626-41eb-82b0-6f33dfaf3001-kube-api-access-sdbzq\") pod \"dns-operator-744455d44c-6rj5d\" (UID: \"e25f5a36-9626-41eb-82b0-6f33dfaf3001\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.465199 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tg67\" (UniqueName: \"kubernetes.io/projected/3e81d771-d34a-4a73-b0fd-0ea7e72f9e25-kube-api-access-7tg67\") pod \"openshift-config-operator-7777fb866f-xh9xx\" (UID: \"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.472655 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.472926 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:51.972892625 +0000 UTC m=+43.183374577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.472985 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-stats-auth\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473036 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473079 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc 
kubenswrapper[4703]: I1011 03:55:51.473109 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886a9919-144b-4a33-81ac-90b79512aa6a-proxy-tls\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473143 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-metrics-certs\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473211 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473245 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c0331e0-7b42-4227-8d50-d4c66a927dea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473291 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: 
\"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473319 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b7cae24-659a-4bda-b925-d74698801dd9-config-volume\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473352 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd47f\" (UniqueName: \"kubernetes.io/projected/886a9919-144b-4a33-81ac-90b79512aa6a-kube-api-access-wd47f\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473382 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzsb\" (UniqueName: \"kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473430 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx4ks\" (UniqueName: \"kubernetes.io/projected/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-kube-api-access-nx4ks\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473485 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473517 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qm7z\" (UniqueName: \"kubernetes.io/projected/fd6eb215-d833-4726-b13c-54c53ca1e7e5-kube-api-access-4qm7z\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473557 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-serving-cert\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473597 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-mountpoint-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473635 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473670 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3516a340-0474-422a-bc3c-4424e08c17b0-service-ca-bundle\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473710 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/291efcc3-da30-4a56-855c-8ee24e840b26-signing-cabundle\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473745 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-service-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473806 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-webhook-cert\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473835 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-client\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473862 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-registration-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473901 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q86hk\" (UniqueName: \"kubernetes.io/projected/5b7cae24-659a-4bda-b925-d74698801dd9-kube-api-access-q86hk\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473930 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4n5\" (UniqueName: \"kubernetes.io/projected/2c0331e0-7b42-4227-8d50-d4c66a927dea-kube-api-access-kn4n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.473973 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-srv-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474001 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw2c\" (UniqueName: \"kubernetes.io/projected/beb46851-cb80-4329-85b4-c2c87ebeb836-kube-api-access-ssw2c\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fa6f46d-74a5-4e8b-b112-55ed455b5409-config\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474107 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bhl\" (UniqueName: \"kubernetes.io/projected/8fa6f46d-74a5-4e8b-b112-55ed455b5409-kube-api-access-b8bhl\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474137 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-certs\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474165 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/beb46851-cb80-4329-85b4-c2c87ebeb836-tmpfs\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474213 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-images\") pod 
\"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474319 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/0b4045ea-f5af-4576-9ada-baddba2cc319-kube-api-access-twlw5\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474361 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-socket-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474398 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474433 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d64762-444d-41a0-bf17-d8de6712a90f-proxy-tls\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474504 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/291efcc3-da30-4a56-855c-8ee24e840b26-signing-key\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474538 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95f6c15c-5422-4ef6-a4a8-108959afcae6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474570 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd6eb215-d833-4726-b13c-54c53ca1e7e5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474602 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2qg\" (UniqueName: \"kubernetes.io/projected/34d64762-444d-41a0-bf17-d8de6712a90f-kube-api-access-nb2qg\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474631 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: 
\"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474666 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmqc\" (UniqueName: \"kubernetes.io/projected/59fc390f-24d6-4f16-855b-475f50c6f4b0-kube-api-access-ncmqc\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474696 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-csi-data-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474725 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0331e0-7b42-4227-8d50-d4c66a927dea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-plugins-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474796 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7lt\" (UniqueName: 
\"kubernetes.io/projected/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-kube-api-access-sh7lt\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474826 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474860 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474900 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-config\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474932 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6ms5\" (UniqueName: \"kubernetes.io/projected/95f6c15c-5422-4ef6-a4a8-108959afcae6-kube-api-access-l6ms5\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc 
kubenswrapper[4703]: I1011 03:55:51.474964 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znkwt\" (UniqueName: \"kubernetes.io/projected/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-kube-api-access-znkwt\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.474995 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa6f46d-74a5-4e8b-b112-55ed455b5409-serving-cert\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475029 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34d64762-444d-41a0-bf17-d8de6712a90f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475078 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cst4c\" (UniqueName: \"kubernetes.io/projected/291efcc3-da30-4a56-855c-8ee24e840b26-kube-api-access-cst4c\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475108 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-node-bootstrap-token\") pod \"machine-config-server-6588z\" (UID: 
\"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55tmj\" (UniqueName: \"kubernetes.io/projected/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-kube-api-access-55tmj\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475183 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrkv\" (UniqueName: \"kubernetes.io/projected/3516a340-0474-422a-bc3c-4424e08c17b0-kube-api-access-wxrkv\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475212 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-cert\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475265 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475323 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475366 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475399 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b7cae24-659a-4bda-b925-d74698801dd9-metrics-tls\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475435 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-default-certificate\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475488 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtg94\" (UniqueName: \"kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.475520 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6bj\" (UniqueName: 
\"kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.476538 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0331e0-7b42-4227-8d50-d4c66a927dea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.476711 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.476971 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-plugins-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.477847 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34d64762-444d-41a0-bf17-d8de6712a90f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.478756 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-config\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.479103 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-stats-auth\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.479148 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d64762-444d-41a0-bf17-d8de6712a90f-proxy-tls\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.479646 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.480202 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b7cae24-659a-4bda-b925-d74698801dd9-config-volume\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.480837 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.481090 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/beb46851-cb80-4329-85b4-c2c87ebeb836-tmpfs\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.481527 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.481706 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.482153 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-socket-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.482301 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-registration-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.482619 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/886a9919-144b-4a33-81ac-90b79512aa6a-images\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.483079 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-certs\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.483412 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-service-ca\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.483482 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-mountpoint-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.484224 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.484912 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3516a340-0474-422a-bc3c-4424e08c17b0-service-ca-bundle\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.484961 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa6f46d-74a5-4e8b-b112-55ed455b5409-serving-cert\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.485062 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: 
\"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.485648 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/291efcc3-da30-4a56-855c-8ee24e840b26-signing-cabundle\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.485910 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c0331e0-7b42-4227-8d50-d4c66a927dea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.486561 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd6eb215-d833-4726-b13c-54c53ca1e7e5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.486609 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-apiservice-cert\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.487530 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-node-bootstrap-token\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.487945 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.488628 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59fc390f-24d6-4f16-855b-475f50c6f4b0-csi-data-dir\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.489175 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-etcd-client\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.489659 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w79s7\" (UID: \"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.489717 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0b4045ea-f5af-4576-9ada-baddba2cc319-srv-cert\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.489875 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-serving-cert\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.490249 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95f6c15c-5422-4ef6-a4a8-108959afcae6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.490504 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.490575 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:51.990553064 +0000 UTC m=+43.201035006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.491270 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fa6f46d-74a5-4e8b-b112-55ed455b5409-config\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.491301 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.491832 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.493164 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886a9919-144b-4a33-81ac-90b79512aa6a-proxy-tls\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: 
\"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.493835 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b7cae24-659a-4bda-b925-d74698801dd9-metrics-tls\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.494161 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-metrics-certs\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.494869 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3516a340-0474-422a-bc3c-4424e08c17b0-default-certificate\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.495161 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-cert\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.495488 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.498034 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/beb46851-cb80-4329-85b4-c2c87ebeb836-webhook-cert\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.505163 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/291efcc3-da30-4a56-855c-8ee24e840b26-signing-key\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.511780 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.529918 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.537941 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9831579-a012-4c7d-a48c-2c0eee55cc36-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8whdq\" (UID: \"a9831579-a012-4c7d-a48c-2c0eee55cc36\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.556849 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7580d2bf-4b66-4dfb-9693-ebcc64225c58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcgcx\" (UID: \"7580d2bf-4b66-4dfb-9693-ebcc64225c58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.568336 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62c5\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-kube-api-access-f62c5\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.576691 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.577286 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.077265529 +0000 UTC m=+43.287747451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.582211 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95nl\" (UniqueName: \"kubernetes.io/projected/0b2f916a-1f36-4ff0-96df-135bcfc2d5f8-kube-api-access-c95nl\") pod \"catalog-operator-68c6474976-dr9rm\" (UID: \"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.604345 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.610376 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5878306c-90e4-4924-b23c-90079c35d2dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4nf42\" (UID: \"5878306c-90e4-4924-b23c-90079c35d2dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.632738 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ffhd\" (UniqueName: \"kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd\") pod \"route-controller-manager-6576b87f9c-rz7sl\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.653109 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnqrh\" (UniqueName: \"kubernetes.io/projected/9b135157-3c40-4f85-84a4-5c521ccbf1bd-kube-api-access-vnqrh\") pod \"apiserver-76f77b778f-pb92d\" (UID: \"9b135157-3c40-4f85-84a4-5c521ccbf1bd\") " pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.661717 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.663289 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.671706 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.679666 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.679981 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.17997046 +0000 UTC m=+43.390452382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: W1011 03:55:51.686684 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ddef85_eb82_4975_aaad_83f550825f30.slice/crio-890d18f1755e513f7b106e894d209f5a69a041c08dfea9bf3113b1675933de57 WatchSource:0}: Error finding container 890d18f1755e513f7b106e894d209f5a69a041c08dfea9bf3113b1675933de57: Status 404 returned error can't find the container with id 890d18f1755e513f7b106e894d209f5a69a041c08dfea9bf3113b1675933de57 Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.688347 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98w9g\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.702372 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7b9c\" (UniqueName: \"kubernetes.io/projected/b70b31eb-da80-4680-988d-7dd5d8ab1fe6-kube-api-access-g7b9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdzmz\" (UID: \"b70b31eb-da80-4680-988d-7dd5d8ab1fe6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.719377 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rj5d"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.721824 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.725282 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88q5n\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-kube-api-access-88q5n\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.735611 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.742196 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" Oct 11 03:55:51 crc kubenswrapper[4703]: W1011 03:55:51.743033 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25f5a36_9626_41eb_82b0_6f33dfaf3001.slice/crio-412fb591fa77bcb678af5287816984f6669edfb95ea792cfb2110c3c499082ca WatchSource:0}: Error finding container 412fb591fa77bcb678af5287816984f6669edfb95ea792cfb2110c3c499082ca: Status 404 returned error can't find the container with id 412fb591fa77bcb678af5287816984f6669edfb95ea792cfb2110c3c499082ca Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.746677 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.750959 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgf4w\" (UniqueName: \"kubernetes.io/projected/4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9-kube-api-access-vgf4w\") pod \"machine-approver-56656f9798-842hf\" (UID: \"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.755607 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.762890 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.763593 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.775558 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cs8d\" (UniqueName: \"kubernetes.io/projected/34cb015e-a7ba-4cea-a2fc-ab3b101299ce-kube-api-access-7cs8d\") pod \"console-f9d7485db-m5qlx\" (UID: \"34cb015e-a7ba-4cea-a2fc-ab3b101299ce\") " pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: W1011 03:55:51.779534 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e81d771_d34a_4a73_b0fd_0ea7e72f9e25.slice/crio-cc53b56fff17591f028e7d261d680918664308b14d539ed8ac1238a2a516cc87 WatchSource:0}: Error finding container cc53b56fff17591f028e7d261d680918664308b14d539ed8ac1238a2a516cc87: Status 404 returned error can't find the container with id cc53b56fff17591f028e7d261d680918664308b14d539ed8ac1238a2a516cc87 Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.780686 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.780778 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-79f5m\" (UniqueName: \"kubernetes.io/projected/b7f45d1c-1e38-42a4-a4e6-b682787d73cd-kube-api-access-79f5m\") pod \"console-operator-58897d9998-vmhjk\" (UID: \"b7f45d1c-1e38-42a4-a4e6-b682787d73cd\") " pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.780865 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.280833561 +0000 UTC m=+43.491315483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.781100 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.781544 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.281536099 +0000 UTC m=+43.492018211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.803637 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6bd\" (UniqueName: \"kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd\") pod \"oauth-openshift-558db77b4-bz6vn\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.826288 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwp8\" (UniqueName: \"kubernetes.io/projected/be037531-a087-4b3e-9327-f28d38fd3961-kube-api-access-8fwp8\") pod \"cluster-samples-operator-665b6dd947-h9vv2\" (UID: \"be037531-a087-4b3e-9327-f28d38fd3961\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.861868 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.863491 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c456c157-fdbc-483e-b2ca-ceb0d2b6247b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q926p\" (UID: \"c456c157-fdbc-483e-b2ca-ceb0d2b6247b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.871822 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64v9p\" (UniqueName: \"kubernetes.io/projected/cffc1277-9912-47dd-80a3-d724149e420c-kube-api-access-64v9p\") pod \"apiserver-7bbb656c7d-4fv46\" (UID: \"cffc1277-9912-47dd-80a3-d724149e420c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.882048 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.882098 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.882436 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.38239397 +0000 UTC m=+43.592875892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.882531 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.882900 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qm7z\" (UniqueName: \"kubernetes.io/projected/fd6eb215-d833-4726-b13c-54c53ca1e7e5-kube-api-access-4qm7z\") pod \"package-server-manager-789f6589d5-75knh\" (UID: \"fd6eb215-d833-4726-b13c-54c53ca1e7e5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.882932 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.382914355 +0000 UTC m=+43.593396377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.886501 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.908703 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" event={"ID":"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25","Type":"ContainerStarted","Data":"cc53b56fff17591f028e7d261d680918664308b14d539ed8ac1238a2a516cc87"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.911957 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.913637 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6bj\" (UniqueName: \"kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj\") pod \"cni-sysctl-allowlist-ds-69n87\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.931414 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" event={"ID":"990fd4dc-4606-469a-9ced-8f434c2df124","Type":"ContainerStarted","Data":"23ac84f20a46040c1d68dee8d68a40ce74ddf64bf3b31e959cea6641098f82a9"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.935159 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cst4c\" (UniqueName: \"kubernetes.io/projected/291efcc3-da30-4a56-855c-8ee24e840b26-kube-api-access-cst4c\") pod \"service-ca-9c57cc56f-8wq6t\" (UID: \"291efcc3-da30-4a56-855c-8ee24e840b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.935530 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.937754 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" event={"ID":"990fd4dc-4606-469a-9ced-8f434c2df124","Type":"ContainerStarted","Data":"877200a8844aa195b2a792acfe872c6435605f48bf38f6c03e58c608d646eb82"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.937817 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" event={"ID":"990fd4dc-4606-469a-9ced-8f434c2df124","Type":"ContainerStarted","Data":"a57f4c19a8456de90c0a9e4169331906b67f25414eb4a5025959e6e0254fd7e2"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.940692 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" event={"ID":"99ddef85-eb82-4975-aaad-83f550825f30","Type":"ContainerStarted","Data":"890d18f1755e513f7b106e894d209f5a69a041c08dfea9bf3113b1675933de57"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.947017 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7lt\" (UniqueName: \"kubernetes.io/projected/dfc1634d-27c3-4dd8-a8b4-0d9177961ee9-kube-api-access-sh7lt\") pod \"etcd-operator-b45778765-dfgkf\" (UID: \"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.950263 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.954703 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dnf85" event={"ID":"f5fe968e-3846-4f87-a7b9-4fdb9d945eab","Type":"ContainerStarted","Data":"6a49211d253fff91e4b5ec08cb3ca4106b5d7b265e603ccbfea1e501f512c00e"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.954759 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dnf85" event={"ID":"f5fe968e-3846-4f87-a7b9-4fdb9d945eab","Type":"ContainerStarted","Data":"067b2efb096086bd3d5420b71c37cfc065dbdc1c0999447663d265c7863cb787"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.955217 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.965381 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-dnf85 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.965444 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dnf85" podUID="f5fe968e-3846-4f87-a7b9-4fdb9d945eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.966280 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" event={"ID":"e25f5a36-9626-41eb-82b0-6f33dfaf3001","Type":"ContainerStarted","Data":"412fb591fa77bcb678af5287816984f6669edfb95ea792cfb2110c3c499082ca"} Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 
03:55:51.969280 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bhl\" (UniqueName: \"kubernetes.io/projected/8fa6f46d-74a5-4e8b-b112-55ed455b5409-kube-api-access-b8bhl\") pod \"service-ca-operator-777779d784-psx9s\" (UID: \"8fa6f46d-74a5-4e8b-b112-55ed455b5409\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.982872 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.984265 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.984779 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" event={"ID":"01868ae6-6122-48a8-bc3a-3cb62cf08ac2","Type":"ContainerStarted","Data":"d02e7b552bf6472e0a02ddac30cc1296479e29ccb49b3fb00a40c936e7b33d34"} Oct 11 03:55:51 crc kubenswrapper[4703]: E1011 03:55:51.985682 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.485651126 +0000 UTC m=+43.696133048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.986267 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6ms5\" (UniqueName: \"kubernetes.io/projected/95f6c15c-5422-4ef6-a4a8-108959afcae6-kube-api-access-l6ms5\") pod \"control-plane-machine-set-operator-78cbb6b69f-8wrsq\" (UID: \"95f6c15c-5422-4ef6-a4a8-108959afcae6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:51 crc kubenswrapper[4703]: I1011 03:55:51.986548 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.013892 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.015761 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.016093 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znkwt\" (UniqueName: \"kubernetes.io/projected/4e2ba5fe-e94f-4d40-8596-d4b9ad863215-kube-api-access-znkwt\") pod \"ingress-canary-2jlb9\" (UID: \"4e2ba5fe-e94f-4d40-8596-d4b9ad863215\") " pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.040141 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2qg\" (UniqueName: \"kubernetes.io/projected/34d64762-444d-41a0-bf17-d8de6712a90f-kube-api-access-nb2qg\") pod \"machine-config-controller-84d6567774-4v67m\" (UID: \"34d64762-444d-41a0-bf17-d8de6712a90f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.061032 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/0b4045ea-f5af-4576-9ada-baddba2cc319-kube-api-access-twlw5\") pod \"olm-operator-6b444d44fb-z7wfl\" (UID: \"0b4045ea-f5af-4576-9ada-baddba2cc319\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.065589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd47f\" (UniqueName: \"kubernetes.io/projected/886a9919-144b-4a33-81ac-90b79512aa6a-kube-api-access-wd47f\") pod \"machine-config-operator-74547568cd-b5mqn\" (UID: \"886a9919-144b-4a33-81ac-90b79512aa6a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.071863 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.085157 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.085551 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzsb\" (UniqueName: \"kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb\") pod \"collect-profiles-29335905-4jslx\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.086033 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.086829 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.586810005 +0000 UTC m=+43.797291927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.093277 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.100229 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.114747 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.116027 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx4ks\" (UniqueName: \"kubernetes.io/projected/5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb-kube-api-access-nx4ks\") pod \"multus-admission-controller-857f4d67dd-hrr65\" (UID: \"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.122774 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.128066 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4n5\" (UniqueName: \"kubernetes.io/projected/2c0331e0-7b42-4227-8d50-d4c66a927dea-kube-api-access-kn4n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-b5vqb\" (UID: \"2c0331e0-7b42-4227-8d50-d4c66a927dea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.132056 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.145936 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q86hk\" (UniqueName: \"kubernetes.io/projected/5b7cae24-659a-4bda-b925-d74698801dd9-kube-api-access-q86hk\") pod \"dns-default-rf7jf\" (UID: \"5b7cae24-659a-4bda-b925-d74698801dd9\") " pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.149705 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.158296 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.171642 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrkv\" (UniqueName: \"kubernetes.io/projected/3516a340-0474-422a-bc3c-4424e08c17b0-kube-api-access-wxrkv\") pod \"router-default-5444994796-mlb5q\" (UID: \"3516a340-0474-422a-bc3c-4424e08c17b0\") " pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.178732 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.186968 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.187094 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.68706713 +0000 UTC m=+43.897549052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.187263 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.187706 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.687689876 +0000 UTC m=+43.898171798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.194450 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55tmj\" (UniqueName: \"kubernetes.io/projected/01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc-kube-api-access-55tmj\") pod \"machine-config-server-6588z\" (UID: \"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc\") " pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.224930 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.230020 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw2c\" (UniqueName: \"kubernetes.io/projected/beb46851-cb80-4329-85b4-c2c87ebeb836-kube-api-access-ssw2c\") pod \"packageserver-d55dfcdfc-vmsw5\" (UID: \"beb46851-cb80-4329-85b4-c2c87ebeb836\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.244785 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.256351 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtg94\" (UniqueName: \"kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94\") pod \"marketplace-operator-79b997595-tjd96\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.258168 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmqc\" (UniqueName: \"kubernetes.io/projected/59fc390f-24d6-4f16-855b-475f50c6f4b0-kube-api-access-ncmqc\") pod \"csi-hostpathplugin-728m8\" (UID: \"59fc390f-24d6-4f16-855b-475f50c6f4b0\") " pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.259841 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-728m8" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.269984 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2jlb9" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.289143 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.289498 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-11 03:55:52.789479963 +0000 UTC m=+43.999961885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.393715 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.394581 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.894564886 +0000 UTC m=+44.105046808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.415010 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.444137 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.452047 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.466120 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.472729 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6588z" Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.495720 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.496271 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:52.99625144 +0000 UTC m=+44.206733362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.522950 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8rmz"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.541670 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.597965 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.598331 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.098318142 +0000 UTC m=+44.308800064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.698715 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.698995 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.198957088 +0000 UTC m=+44.409439010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.699261 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.699648 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.199632066 +0000 UTC m=+44.410113988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.736671 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m5qlx"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.768696 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.784266 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.785942 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.787702 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.795522 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq"] Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.800210 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.800377 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.300356354 +0000 UTC m=+44.510838276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.800819 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.801120 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.301111584 +0000 UTC m=+44.511593506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.913950 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:52 crc kubenswrapper[4703]: E1011 03:55:52.914824 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.414796796 +0000 UTC m=+44.625278718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:52 crc kubenswrapper[4703]: W1011 03:55:52.968812 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cb015e_a7ba_4cea_a2fc_ab3b101299ce.slice/crio-664bb5d39c573a8ec96c393c72f4bcfe12874a27280abf911bb77202e8866a29 WatchSource:0}: Error finding container 664bb5d39c573a8ec96c393c72f4bcfe12874a27280abf911bb77202e8866a29: Status 404 returned error can't find the container with id 664bb5d39c573a8ec96c393c72f4bcfe12874a27280abf911bb77202e8866a29 Oct 11 03:55:52 crc kubenswrapper[4703]: W1011 03:55:52.988718 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2f916a_1f36_4ff0_96df_135bcfc2d5f8.slice/crio-8a02ee0c8e4f9f7ea76f20b257dbc1d5c6de05c1e72f768f4d233f7f85cfbe3b WatchSource:0}: Error finding container 8a02ee0c8e4f9f7ea76f20b257dbc1d5c6de05c1e72f768f4d233f7f85cfbe3b: Status 404 returned error can't find the container with id 8a02ee0c8e4f9f7ea76f20b257dbc1d5c6de05c1e72f768f4d233f7f85cfbe3b Oct 11 03:55:52 crc kubenswrapper[4703]: I1011 03:55:52.995704 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" event={"ID":"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9","Type":"ContainerStarted","Data":"c3550a210c63e2a19e31541e849027bd13dee2760feb6e0e1b82bb18ab534122"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.008684 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" event={"ID":"01618421-523c-44ab-b631-f2624ba8ab2d","Type":"ContainerStarted","Data":"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.008722 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" event={"ID":"01618421-523c-44ab-b631-f2624ba8ab2d","Type":"ContainerStarted","Data":"2a712460999be14f5cfe2b01b79b3090e05fa72aaeb2260e094e7e3f2016bb01"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.009658 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.016648 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.018193 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.518173424 +0000 UTC m=+44.728655346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.018331 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" event={"ID":"9425edee-98a4-4086-8d31-28478469501b","Type":"ContainerStarted","Data":"880ce472699b61afb84f98f7edf57a941e8e557d202791e27418818d6e418c03"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.028089 4703 generic.go:334] "Generic (PLEG): container finished" podID="3e81d771-d34a-4a73-b0fd-0ea7e72f9e25" containerID="0b4ada77668d7d31f96186577253ced1ffdd285459f3ec0b0bb4e9a4d8284199" exitCode=0 Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.028174 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" event={"ID":"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25","Type":"ContainerDied","Data":"0b4ada77668d7d31f96186577253ced1ffdd285459f3ec0b0bb4e9a4d8284199"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.039327 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" event={"ID":"e25f5a36-9626-41eb-82b0-6f33dfaf3001","Type":"ContainerStarted","Data":"96965f8bcbe71240bfd91a6fb712578b5c225d8ccea13cc4fa1ff2866d98dec9"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.042219 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 
03:55:53.042503 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5qlx" event={"ID":"34cb015e-a7ba-4cea-a2fc-ab3b101299ce","Type":"ContainerStarted","Data":"664bb5d39c573a8ec96c393c72f4bcfe12874a27280abf911bb77202e8866a29"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.045455 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" event={"ID":"ff55314a-7f2f-42c9-9723-2fdaad791959","Type":"ContainerStarted","Data":"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.045518 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" event={"ID":"ff55314a-7f2f-42c9-9723-2fdaad791959","Type":"ContainerStarted","Data":"5e33f3193171c7eab23141033858cca2b4f7c6facbc936d19c5dfd714d09d480"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.045891 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.047475 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" event={"ID":"99ddef85-eb82-4975-aaad-83f550825f30","Type":"ContainerStarted","Data":"304f318c95b01431a589502d75b1eaf30533a3c3b60a02477a7d4c7100151682"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.047539 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" event={"ID":"99ddef85-eb82-4975-aaad-83f550825f30","Type":"ContainerStarted","Data":"b601b0d3466bb84efd6ce15d0f6e724f1e504c011ca962d285c1e9fdf9d9ed2d"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.048977 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-mlb5q" event={"ID":"3516a340-0474-422a-bc3c-4424e08c17b0","Type":"ContainerStarted","Data":"b89c1883361bd77c7bac7884bf1fbaff2be98bec677be46c575d1680272671af"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.056720 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vmhjk"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.057644 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" event={"ID":"7580d2bf-4b66-4dfb-9693-ebcc64225c58","Type":"ContainerStarted","Data":"6be71b31ed4ea081e6b0f7717f31927cb092786d986ccf73233aee86c39c3332"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.059592 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" event={"ID":"62aed52a-c801-4ec0-b96b-6132de0e4200","Type":"ContainerStarted","Data":"da85aa2600619afc6c65e1370d172421865a1ae98a1bd6a96e0f67e3c0a32c16"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.063697 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pb92d"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.068415 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-dnf85 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.068839 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dnf85" podUID="f5fe968e-3846-4f87-a7b9-4fdb9d945eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Oct 11 03:55:53 crc 
kubenswrapper[4703]: I1011 03:55:53.067789 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6588z" event={"ID":"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc","Type":"ContainerStarted","Data":"444edbf5a97100c40c73655f733d131e1665ab3ff3bf0c7d0eb41c2090404808"} Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.083354 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.117542 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.118126 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.120159 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.620134155 +0000 UTC m=+44.830616067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.122903 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.130396 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c8bf83f-ab19-43ea-b331-db24e4e97709-metrics-certs\") pod \"network-metrics-daemon-4s5kf\" (UID: \"8c8bf83f-ab19-43ea-b331-db24e4e97709\") " pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.163854 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s5kf" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.219752 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.220104 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-11 03:55:53.720076502 +0000 UTC m=+44.930558424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: W1011 03:55:53.297448 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd6eb215_d833_4726_b13c_54c53ca1e7e5.slice/crio-98d83771c9617e24b5661a7ab962102e6c8f5b53046cdf1f9d1561f1c28dbd9f WatchSource:0}: Error finding container 98d83771c9617e24b5661a7ab962102e6c8f5b53046cdf1f9d1561f1c28dbd9f: Status 404 returned error can't find the container with id 98d83771c9617e24b5661a7ab962102e6c8f5b53046cdf1f9d1561f1c28dbd9f Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.307945 4703 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9lb9d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.308002 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 11 03:55:53 crc kubenswrapper[4703]: W1011 03:55:53.312909 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f45d1c_1e38_42a4_a4e6_b682787d73cd.slice/crio-e1ce57397beece1c76ec3d338c23fa1c199fb257e83f3c4f50cbb71db7823f5a WatchSource:0}: Error finding container e1ce57397beece1c76ec3d338c23fa1c199fb257e83f3c4f50cbb71db7823f5a: Status 404 returned error can't find the container with id e1ce57397beece1c76ec3d338c23fa1c199fb257e83f3c4f50cbb71db7823f5a Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.320846 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.321180 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.821164819 +0000 UTC m=+45.031646741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.349830 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dnf85" podStartSLOduration=18.34981375 podStartE2EDuration="18.34981375s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:53.347870639 +0000 UTC m=+44.558352561" watchObservedRunningTime="2025-10-11 03:55:53.34981375 +0000 UTC m=+44.560295672" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.427283 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.437827 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:53.93780943 +0000 UTC m=+45.148291352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.538114 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.538596 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.038579079 +0000 UTC m=+45.249061001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.642515 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.643539 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.143522149 +0000 UTC m=+45.354004071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.661166 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.689232 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl"] Oct 11 03:55:53 crc kubenswrapper[4703]: W1011 03:55:53.711554 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886a9919_144b_4a33_81ac_90b79512aa6a.slice/crio-e86d9b74590e853cbb6328f52fce81627e034859fa91f70e4ddea7c2426ed583 WatchSource:0}: Error finding container e86d9b74590e853cbb6328f52fce81627e034859fa91f70e4ddea7c2426ed583: Status 404 returned error can't find the container with id e86d9b74590e853cbb6328f52fce81627e034859fa91f70e4ddea7c2426ed583 Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.711619 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.724646 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.729258 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.736092 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hrr65"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.743979 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.744590 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.244557904 +0000 UTC m=+45.455039826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.744998 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.745538 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.24552261 +0000 UTC m=+45.456004532 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.751642 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-psx9s"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.752769 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.775997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dfgkf"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.793503 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.801225 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.803970 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8wq6t"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.813061 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fn9dd" podStartSLOduration=17.813044905 podStartE2EDuration="17.813044905s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-11 03:55:53.810107157 +0000 UTC m=+45.020589079" watchObservedRunningTime="2025-10-11 03:55:53.813044905 +0000 UTC m=+45.023526827" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.846179 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.847183 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.347149681 +0000 UTC m=+45.557631603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.849182 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4s5kf"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.850311 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.850590 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.350577113 +0000 UTC m=+45.561059035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.884799 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rf7jf"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.896337 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q458x" podStartSLOduration=18.896296809 podStartE2EDuration="18.896296809s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:53.869930278 +0000 UTC m=+45.080412200" watchObservedRunningTime="2025-10-11 03:55:53.896296809 +0000 UTC m=+45.106778731" Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.901551 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2jlb9"] Oct 11 03:55:53 crc kubenswrapper[4703]: I1011 03:55:53.904350 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-728m8"] Oct 11 03:55:53 
crc kubenswrapper[4703]: I1011 03:55:53.920446 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx"] Oct 11 03:55:53 crc kubenswrapper[4703]: E1011 03:55:53.996652 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.496614685 +0000 UTC m=+45.707096617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.008082 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.008142 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.011592 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.012298 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.512279101 +0000 UTC m=+45.722761023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.113762 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.114451 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.614430057 +0000 UTC m=+45.824911979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.123404 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5"] Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.162084 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" event={"ID":"83320069-4d0c-4e40-8a7f-b3bc4beda7a1","Type":"ContainerStarted","Data":"594a35444e4d33b415b7636780710247c44c4e2ad7c3ef364c4cde3e44ebdcf0"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.168032 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mlb5q" event={"ID":"3516a340-0474-422a-bc3c-4424e08c17b0","Type":"ContainerStarted","Data":"07825c85044bb8c230154c9b6c6aee00c323e35fda38ff0c06f18d54ebfccd5d"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.192686 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" event={"ID":"7580d2bf-4b66-4dfb-9693-ebcc64225c58","Type":"ContainerStarted","Data":"d0eaa974adb219700533b8a6c30e5cd257d104843c02e26ef11a3e256a670786"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.207573 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" 
event={"ID":"0b4045ea-f5af-4576-9ada-baddba2cc319","Type":"ContainerStarted","Data":"9c97b64689861c8bf31a065ea8e4f281187ab7a0e2c2bf04b58c7f135f4c259c"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.217433 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.218377 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.71836211 +0000 UTC m=+45.928844032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.218601 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6588z" event={"ID":"01e5cd3c-edc8-486e-9b3c-9ec1ca513fcc","Type":"ContainerStarted","Data":"2fe462f04498d72e638cd993d9b9a7c4b9664409102077d851b001b1480f430e"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.225896 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s5kf" 
event={"ID":"8c8bf83f-ab19-43ea-b331-db24e4e97709","Type":"ContainerStarted","Data":"11026830703c27ceb6f89dbad239b49c2fa45ccec0a5c2be21a82d39fa7d7b85"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.229832 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" event={"ID":"3e81d771-d34a-4a73-b0fd-0ea7e72f9e25","Type":"ContainerStarted","Data":"60a0985a9eb86c36badcc586c82117a8dd71e95cdaaa6e91811eec203071dc17"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.229877 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.244360 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" event={"ID":"c456c157-fdbc-483e-b2ca-ceb0d2b6247b","Type":"ContainerStarted","Data":"13b31c34574a6e96a50df53469ea734f77e3f88d9cc2fd9e31f84f7868e3adc7"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.285992 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" event={"ID":"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9","Type":"ContainerStarted","Data":"dd62b5666635bea7b9e5b2830b9f29a50c1e752143b266b6af00defcf0ac53d0"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.304607 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" event={"ID":"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb","Type":"ContainerStarted","Data":"161bc4c25b8d6f44a6772576462a33b4cc9e22cabc28e23dc1d5690b136d5a4d"} Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.315870 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" 
event={"ID":"2c0331e0-7b42-4227-8d50-d4c66a927dea","Type":"ContainerStarted","Data":"28b7527edb52420476e67162ad4b0f9620a6e7ffb3a003a27eb45b363c18e052"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.318907 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.322336 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.822311034 +0000 UTC m=+46.032792956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.333169 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" event={"ID":"886a9919-144b-4a33-81ac-90b79512aa6a","Type":"ContainerStarted","Data":"e86d9b74590e853cbb6328f52fce81627e034859fa91f70e4ddea7c2426ed583"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.346657 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcgcx" podStartSLOduration=18.34662603 podStartE2EDuration="18.34662603s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.341905654 +0000 UTC m=+45.552387576" watchObservedRunningTime="2025-10-11 03:55:54.34662603 +0000 UTC m=+45.557107952"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.350243 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" event={"ID":"5878306c-90e4-4924-b23c-90079c35d2dc","Type":"ContainerStarted","Data":"56eb4388fa2ca01bfdf524aedf59336e305b499e87322e14ad0aa8b7683edbf3"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.350290 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" event={"ID":"5878306c-90e4-4924-b23c-90079c35d2dc","Type":"ContainerStarted","Data":"8c37427ef815ed0caed5b3f739990fedccbcc8c952ccabeb5bc2e9f26ef6b656"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.362089 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" event={"ID":"95f6c15c-5422-4ef6-a4a8-108959afcae6","Type":"ContainerStarted","Data":"bf072ed08c296be731520927564ec2718ba78a8fa7a89f67032f0265f2519abe"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.374395 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" podStartSLOduration=19.374379757 podStartE2EDuration="19.374379757s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.373549745 +0000 UTC m=+45.584031677" watchObservedRunningTime="2025-10-11 03:55:54.374379757 +0000 UTC m=+45.584861679"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.394752 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2jlb9" event={"ID":"4e2ba5fe-e94f-4d40-8596-d4b9ad863215","Type":"ContainerStarted","Data":"8cb82d8a8070724678b9dba14b3acf029e5fff32c337c3d703e0103741b6505a"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.399891 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" event={"ID":"b70b31eb-da80-4680-988d-7dd5d8ab1fe6","Type":"ContainerStarted","Data":"024d38100a43cb57b1f22b280e594f6fc1f6d4c639e2fe37d88b564ebeeeb16f"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.400036 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" event={"ID":"b70b31eb-da80-4680-988d-7dd5d8ab1fe6","Type":"ContainerStarted","Data":"3287d7b371ebd629a4bcb3912f72fdf275e1268286458d465748dbcc8f3baf49"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.404771 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" event={"ID":"cffc1277-9912-47dd-80a3-d724149e420c","Type":"ContainerStarted","Data":"ee153ca994b019e065539ac3db8dc4e5cf0ae117dc471d37f9e7d77d3ded0989"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.410354 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" event={"ID":"291efcc3-da30-4a56-855c-8ee24e840b26","Type":"ContainerStarted","Data":"2035ce5eb5515119b1fc0909036a03f6a9bba5106c2546a1115cd55ef39b3e1c"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.422275 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.423128 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mlb5q" podStartSLOduration=18.423107283 podStartE2EDuration="18.423107283s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.422783805 +0000 UTC m=+45.633265727" watchObservedRunningTime="2025-10-11 03:55:54.423107283 +0000 UTC m=+45.633589205"
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.423346 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:54.923328199 +0000 UTC m=+46.133810121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.427884 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" event={"ID":"c7664ffa-e88d-468b-b43d-3b3b6ec5195b","Type":"ContainerStarted","Data":"62f1f6e130fc31d991ac4dbcdd87ba1e3fb683ff9d40686001eeaa2c5906c4d6"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.453554 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mlb5q"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.478533 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 11 03:55:54 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Oct 11 03:55:54 crc kubenswrapper[4703]: [+]process-running ok
Oct 11 03:55:54 crc kubenswrapper[4703]: healthz check failed
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.478590 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.499297 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" event={"ID":"a9831579-a012-4c7d-a48c-2c0eee55cc36","Type":"ContainerStarted","Data":"ab96bc3730d075d81d469d9a1753b43b62ad86f620a37fe64a6fad72ecacd367"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.499345 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" event={"ID":"a9831579-a012-4c7d-a48c-2c0eee55cc36","Type":"ContainerStarted","Data":"87d19c2dcfe0f6472f893c651f1d375a5babee6451e8bf7b722a1a364a071c43"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.502315 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nt5pr" podStartSLOduration=18.502302058 podStartE2EDuration="18.502302058s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.462049398 +0000 UTC m=+45.672531320" watchObservedRunningTime="2025-10-11 03:55:54.502302058 +0000 UTC m=+45.712783980"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.502697 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" podStartSLOduration=5.502693689 podStartE2EDuration="5.502693689s" podCreationTimestamp="2025-10-11 03:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.501477726 +0000 UTC m=+45.711959648" watchObservedRunningTime="2025-10-11 03:55:54.502693689 +0000 UTC m=+45.713175611"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.523125 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.524206 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.02418811 +0000 UTC m=+46.234670032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.530070 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" event={"ID":"e25f5a36-9626-41eb-82b0-6f33dfaf3001","Type":"ContainerStarted","Data":"6aad1dd4f1cef6f305ee581be068936b37e09f2761214bd27b7e6fa58970f0ec"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.535572 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" podStartSLOduration=19.535555032 podStartE2EDuration="19.535555032s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.535308026 +0000 UTC m=+45.745789968" watchObservedRunningTime="2025-10-11 03:55:54.535555032 +0000 UTC m=+45.746036954"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.537262 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5qlx" event={"ID":"34cb015e-a7ba-4cea-a2fc-ab3b101299ce","Type":"ContainerStarted","Data":"398a76ffae29f334a1308efde666faa08aff2b7e0c4e0c09f0e3ef8a3ab8d348"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.549209 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" event={"ID":"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f","Type":"ContainerStarted","Data":"24b053b91571e2a69b37c763941dcc51e2fbdb43a6972c1bdd2ad42508e1bfd6"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.549271 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" event={"ID":"0a9d5e17-f5b5-4f0e-81e2-4edbe41c4e9f","Type":"ContainerStarted","Data":"3d01e3205d863cbf95237fc280936ebca56589a26b55ab8c85b784f8c46f3330"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.553707 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6588z" podStartSLOduration=5.553677904 podStartE2EDuration="5.553677904s" podCreationTimestamp="2025-10-11 03:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.549650067 +0000 UTC m=+45.760131989" watchObservedRunningTime="2025-10-11 03:55:54.553677904 +0000 UTC m=+45.764159836"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.557609 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" event={"ID":"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8","Type":"ContainerStarted","Data":"b7df50ccac77fc62be58bc0cff46a813dd9deb317b96b556228cf2cb3f541322"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.557655 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" event={"ID":"0b2f916a-1f36-4ff0-96df-135bcfc2d5f8","Type":"ContainerStarted","Data":"8a02ee0c8e4f9f7ea76f20b257dbc1d5c6de05c1e72f768f4d233f7f85cfbe3b"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.558883 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.576930 4703 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dr9rm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.577011 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" podUID="0b2f916a-1f36-4ff0-96df-135bcfc2d5f8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.578956 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" event={"ID":"9425edee-98a4-4086-8d31-28478469501b","Type":"ContainerStarted","Data":"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.579886 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.591532 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdzmz" podStartSLOduration=19.591497979 podStartE2EDuration="19.591497979s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.585841819 +0000 UTC m=+45.796323751" watchObservedRunningTime="2025-10-11 03:55:54.591497979 +0000 UTC m=+45.801979901"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.616614 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-728m8" event={"ID":"59fc390f-24d6-4f16-855b-475f50c6f4b0","Type":"ContainerStarted","Data":"67134bd6f46b7a8ce6de4d322401fb84139add36b5a4ab7eecf76b5b5f94fd11"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.620586 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rf7jf" event={"ID":"5b7cae24-659a-4bda-b925-d74698801dd9","Type":"ContainerStarted","Data":"06cabcfc0244592b06f0a271ae7454c800bac38068527b9a40fff050cf045115"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.632100 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" event={"ID":"fd6eb215-d833-4726-b13c-54c53ca1e7e5","Type":"ContainerStarted","Data":"52bffd77a4546c40cdbf3ef1fe48c48025a75819e975a243c897cd29d2120f54"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.632160 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" event={"ID":"fd6eb215-d833-4726-b13c-54c53ca1e7e5","Type":"ContainerStarted","Data":"98d83771c9617e24b5661a7ab962102e6c8f5b53046cdf1f9d1561f1c28dbd9f"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.637616 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.638524 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.643553 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.143531492 +0000 UTC m=+46.354013414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.652490 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" event={"ID":"9b135157-3c40-4f85-84a4-5c521ccbf1bd","Type":"ContainerStarted","Data":"786fb6976b4bffce8bb662e1dd71e2ad98f12e940b029457b3d4ed0245b7c1c8"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.665399 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6rj5d" podStartSLOduration=19.665367393 podStartE2EDuration="19.665367393s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.637303617 +0000 UTC m=+45.847785539" watchObservedRunningTime="2025-10-11 03:55:54.665367393 +0000 UTC m=+45.875849315"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.665691 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-m5qlx" podStartSLOduration=19.665687402 podStartE2EDuration="19.665687402s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.665094436 +0000 UTC m=+45.875576358" watchObservedRunningTime="2025-10-11 03:55:54.665687402 +0000 UTC m=+45.876169324"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.667410 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" event={"ID":"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9","Type":"ContainerStarted","Data":"9ca4f177770232a59424de861cbd26538f5545d7ba8164cb18872da0ffb53812"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.672821 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" event={"ID":"34d64762-444d-41a0-bf17-d8de6712a90f","Type":"ContainerStarted","Data":"fe7b35edd31ca4c35b24a4da741ee0ccc15b99913b9ad8b074b0924cc09b48af"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.690827 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" event={"ID":"62aed52a-c801-4ec0-b96b-6132de0e4200","Type":"ContainerStarted","Data":"f99eaf12e515d110b24b858f7aafbd9a0b94b2542aa4a861e2fb80f8f550eefd"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.719016 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" event={"ID":"8fa6f46d-74a5-4e8b-b112-55ed455b5409","Type":"ContainerStarted","Data":"dc5f167b6b277f4715a223d25927c72102b26c00df500ac70d70381e0dd18347"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.747961 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.748251 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.248226756 +0000 UTC m=+46.458708678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.749132 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.750899 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.250887287 +0000 UTC m=+46.461369209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.761204 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w79s7" podStartSLOduration=18.76118477 podStartE2EDuration="18.76118477s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.696785828 +0000 UTC m=+45.907267750" watchObservedRunningTime="2025-10-11 03:55:54.76118477 +0000 UTC m=+45.971666692"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.762204 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8whdq" podStartSLOduration=18.762197747 podStartE2EDuration="18.762197747s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.761871459 +0000 UTC m=+45.972353391" watchObservedRunningTime="2025-10-11 03:55:54.762197747 +0000 UTC m=+45.972679669"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.781314 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" event={"ID":"b7f45d1c-1e38-42a4-a4e6-b682787d73cd","Type":"ContainerStarted","Data":"e1ce57397beece1c76ec3d338c23fa1c199fb257e83f3c4f50cbb71db7823f5a"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.781418 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vmhjk"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.800940 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8rmz" podStartSLOduration=19.800923457 podStartE2EDuration="19.800923457s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.799083947 +0000 UTC m=+46.009565869" watchObservedRunningTime="2025-10-11 03:55:54.800923457 +0000 UTC m=+46.011405379"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.824874 4703 patch_prober.go:28] interesting pod/console-operator-58897d9998-vmhjk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.824932 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" podUID="b7f45d1c-1e38-42a4-a4e6-b682787d73cd" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.837442 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" event={"ID":"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0","Type":"ContainerStarted","Data":"7662e5490134c9c4bf21ca70e3bab938a356f4cd0598d63d768994f46ac826c6"}
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.856491 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.861829 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.863119 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.36310067 +0000 UTC m=+46.573582592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.886614 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" podStartSLOduration=18.886595774 podStartE2EDuration="18.886595774s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.835933238 +0000 UTC m=+46.046415160" watchObservedRunningTime="2025-10-11 03:55:54.886595774 +0000 UTC m=+46.097077686"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.931637 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" podStartSLOduration=18.931617961 podStartE2EDuration="18.931617961s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.928898799 +0000 UTC m=+46.139380721" watchObservedRunningTime="2025-10-11 03:55:54.931617961 +0000 UTC m=+46.142099883"
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.963692 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:54 crc kubenswrapper[4703]: E1011 03:55:54.965845 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.46582221 +0000 UTC m=+46.676304132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:54 crc kubenswrapper[4703]: I1011 03:55:54.983171 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" podStartSLOduration=19.983140611 podStartE2EDuration="19.983140611s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:54.967918496 +0000 UTC m=+46.178400418" watchObservedRunningTime="2025-10-11 03:55:54.983140611 +0000 UTC m=+46.193622533"
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.073585 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.073939 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.573923985 +0000 UTC m=+46.784405907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.180044 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.180859 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.680837976 +0000 UTC m=+46.891319898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.202458 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-69n87"]
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.287525 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.287946 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.787918032 +0000 UTC m=+46.998399954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.395838 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9"
Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.396259 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:55.896247043 +0000 UTC m=+47.106728965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.474674 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 11 03:55:55 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Oct 11 03:55:55 crc kubenswrapper[4703]: [+]process-running ok
Oct 11 03:55:55 crc kubenswrapper[4703]: healthz check failed
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.475038 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.499257 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.499597 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2025-10-11 03:55:55.99958088 +0000 UTC m=+47.210062802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.605850 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.606162 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.106151003 +0000 UTC m=+47.316632925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.709285 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.709661 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.209644545 +0000 UTC m=+47.420126467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.815001 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.815633 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.315621551 +0000 UTC m=+47.526103463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.863580 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rf7jf" event={"ID":"5b7cae24-659a-4bda-b925-d74698801dd9","Type":"ContainerStarted","Data":"5c691008c2015de82743346c620eb9ef2eb7da3cdb34fcf972c99a96abbd95aa"} Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.879739 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" event={"ID":"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0","Type":"ContainerStarted","Data":"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4"} Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.880807 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.889692 4703 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bz6vn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.889772 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 
10.217.0.21:6443: connect: connection refused" Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.916419 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:55 crc kubenswrapper[4703]: E1011 03:55:55.917610 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.417590662 +0000 UTC m=+47.628072584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.919787 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" event={"ID":"34d64762-444d-41a0-bf17-d8de6712a90f","Type":"ContainerStarted","Data":"9d7bbeba470ff0345eb861bdccc4bbb80135d11ca605b04f5d51a46e4aebe5d8"} Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.947741 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" podStartSLOduration=20.947725674 podStartE2EDuration="20.947725674s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:55.946007597 +0000 UTC m=+47.156489519" watchObservedRunningTime="2025-10-11 03:55:55.947725674 +0000 UTC m=+47.158207596" Oct 11 03:55:55 crc kubenswrapper[4703]: I1011 03:55:55.959933 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" event={"ID":"8fa6f46d-74a5-4e8b-b112-55ed455b5409","Type":"ContainerStarted","Data":"148b03e7cee0fc3cfac8ae6f264c4e3e62bb4f2ad99d77cbab194dd585e6c840"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.000675 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" event={"ID":"0b4045ea-f5af-4576-9ada-baddba2cc319","Type":"ContainerStarted","Data":"a7fcde20153838943598380946de85e0205a46233e05e8e67d7fe36bb706648f"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.001760 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.004857 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-psx9s" podStartSLOduration=20.004846111 podStartE2EDuration="20.004846111s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.003720722 +0000 UTC m=+47.214202654" watchObservedRunningTime="2025-10-11 03:55:56.004846111 +0000 UTC m=+47.215328033" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.020897 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.021329 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.521310189 +0000 UTC m=+47.731792201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.044563 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" podStartSLOduration=20.044547827 podStartE2EDuration="20.044547827s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.042852002 +0000 UTC m=+47.253333924" watchObservedRunningTime="2025-10-11 03:55:56.044547827 +0000 UTC m=+47.255029749" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.050664 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7wfl" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.078708 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" event={"ID":"beb46851-cb80-4329-85b4-c2c87ebeb836","Type":"ContainerStarted","Data":"0b2427475e5a37a29373d86efc3c957b0d2bcce9510e7297d5fefe077f96dd79"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.078754 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" event={"ID":"beb46851-cb80-4329-85b4-c2c87ebeb836","Type":"ContainerStarted","Data":"dc5c37eae9f37f718bbfeceb9811cefad81061ac513efc0e1ae703e5fa8e83fa"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.079695 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.082483 4703 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vmsw5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body= Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.082534 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" podUID="beb46851-cb80-4329-85b4-c2c87ebeb836" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.104839 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" event={"ID":"be037531-a087-4b3e-9327-f28d38fd3961","Type":"ContainerStarted","Data":"80132933fbf23d9e3dd22dd174540342df0ca5bb888171fa4e8059c26dd9b384"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.104952 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" event={"ID":"be037531-a087-4b3e-9327-f28d38fd3961","Type":"ContainerStarted","Data":"cd0ea342256cbe840e6b4177326ae5a76373a06b86fd35ec03e879d4afee199e"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.124963 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.126089 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.626070274 +0000 UTC m=+47.836552186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.135375 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" event={"ID":"886a9919-144b-4a33-81ac-90b79512aa6a","Type":"ContainerStarted","Data":"a7c4da02aaefa1c98026366cfc74a39d955ad9b61f53b2b342aae28a2637551e"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.135625 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" event={"ID":"886a9919-144b-4a33-81ac-90b79512aa6a","Type":"ContainerStarted","Data":"c11d68c02b4710548863ed0a7e13a49c5ca738250aaf4335c3c45c164ba5667c"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.156389 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" podStartSLOduration=20.15637174 podStartE2EDuration="20.15637174s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.123962919 +0000 UTC m=+47.334444851" watchObservedRunningTime="2025-10-11 03:55:56.15637174 +0000 UTC m=+47.366853662" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.179704 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" 
event={"ID":"5878306c-90e4-4924-b23c-90079c35d2dc","Type":"ContainerStarted","Data":"722096aa07411db9fc3d1139d13323d088ab5d39914ca1ae06da4105662c4f4c"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.204116 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" event={"ID":"291efcc3-da30-4a56-855c-8ee24e840b26","Type":"ContainerStarted","Data":"f80d1053ce0d84e34cc16183f38ed3922a7e3b477414c65273258a28bb0bbd79"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.205262 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" podStartSLOduration=21.20524655 podStartE2EDuration="21.20524655s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.15824101 +0000 UTC m=+47.368722932" watchObservedRunningTime="2025-10-11 03:55:56.20524655 +0000 UTC m=+47.415728472" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.206344 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b5mqn" podStartSLOduration=20.206339048 podStartE2EDuration="20.206339048s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.205045413 +0000 UTC m=+47.415527335" watchObservedRunningTime="2025-10-11 03:55:56.206339048 +0000 UTC m=+47.416820970" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.226201 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.228395 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.728384254 +0000 UTC m=+47.938866176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.247323 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4nf42" podStartSLOduration=21.247308538 podStartE2EDuration="21.247308538s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.246882816 +0000 UTC m=+47.457364748" watchObservedRunningTime="2025-10-11 03:55:56.247308538 +0000 UTC m=+47.457790460" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.317259 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" event={"ID":"fd6eb215-d833-4726-b13c-54c53ca1e7e5","Type":"ContainerStarted","Data":"74a1bba0281f7e930db305b4e0f8cf90c24111aab2632ddaab5e140d1fd80e4c"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.319584 4703 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.320097 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8wq6t" podStartSLOduration=20.320076882 podStartE2EDuration="20.320076882s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.319180698 +0000 UTC m=+47.529662620" watchObservedRunningTime="2025-10-11 03:55:56.320076882 +0000 UTC m=+47.530558804" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.329956 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.330669 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.830634502 +0000 UTC m=+48.041116424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.351167 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" event={"ID":"c456c157-fdbc-483e-b2ca-ceb0d2b6247b","Type":"ContainerStarted","Data":"a2bda9d03e4d42b9a0f259c0744e8445bf6bc971ae80c099c724a6e2f4467af8"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.362909 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" podStartSLOduration=20.36289343 podStartE2EDuration="20.36289343s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.360239639 +0000 UTC m=+47.570721561" watchObservedRunningTime="2025-10-11 03:55:56.36289343 +0000 UTC m=+47.573375352" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.406672 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vmhjk" event={"ID":"b7f45d1c-1e38-42a4-a4e6-b682787d73cd","Type":"ContainerStarted","Data":"e24aa342ef82a91de617a33aed89e0ca719cc2b7bc23f891ec9ea858b412da32"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.434382 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.435625 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:56.935614113 +0000 UTC m=+48.146096025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.445421 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" event={"ID":"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb","Type":"ContainerStarted","Data":"ce884d3301e54a1ea144fb1f373783940612c53793e41a053428517c344039c0"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.446986 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q926p" podStartSLOduration=21.446971495 podStartE2EDuration="21.446971495s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.445887986 +0000 UTC m=+47.656369908" watchObservedRunningTime="2025-10-11 03:55:56.446971495 +0000 UTC 
m=+47.657453417" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.462718 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:55:56 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:55:56 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:55:56 crc kubenswrapper[4703]: healthz check failed Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.462787 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.474538 4703 generic.go:334] "Generic (PLEG): container finished" podID="cffc1277-9912-47dd-80a3-d724149e420c" containerID="cd5227b257e0ca8e1e90a863283bbe0f9d0feec0aa5cb78816cfc137d0a5e0fb" exitCode=0 Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.474742 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" event={"ID":"cffc1277-9912-47dd-80a3-d724149e420c","Type":"ContainerDied","Data":"cd5227b257e0ca8e1e90a863283bbe0f9d0feec0aa5cb78816cfc137d0a5e0fb"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.500766 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" event={"ID":"4e1fed36-f837-4e3d-86fa-a4f05ef0a7e9","Type":"ContainerStarted","Data":"ed77a774e623a28016c97322b9a5a88744bdd9f548b7b52d9151f6c7b3de1383"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.526539 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" 
event={"ID":"dfc1634d-27c3-4dd8-a8b4-0d9177961ee9","Type":"ContainerStarted","Data":"692df5f7ab2d655be6f6e355f2a8d2c8c68a3a78511e4d62036f83c160c4c1c8"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.529869 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" event={"ID":"83320069-4d0c-4e40-8a7f-b3bc4beda7a1","Type":"ContainerStarted","Data":"2eb9f8e6d465e4ccdd205ae0aeeca09ce9e2af9ecf85617551218ef952761320"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.530431 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" podStartSLOduration=20.530419854 podStartE2EDuration="20.530419854s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.500386325 +0000 UTC m=+47.710868257" watchObservedRunningTime="2025-10-11 03:55:56.530419854 +0000 UTC m=+47.740901776" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.530716 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.531589 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.531609 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" event={"ID":"c7664ffa-e88d-468b-b43d-3b3b6ec5195b","Type":"ContainerStarted","Data":"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.531726 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.536360 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.536881 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.536981 4703 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tjd96 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.537050 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.537439 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.03742035 +0000 UTC m=+48.247902272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.564189 4703 generic.go:334] "Generic (PLEG): container finished" podID="9b135157-3c40-4f85-84a4-5c521ccbf1bd" containerID="fbabf9ae78d3845786a8744b3c142340eedb71f2070721d38143b8062cc47ba1" exitCode=0 Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.564286 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" event={"ID":"9b135157-3c40-4f85-84a4-5c521ccbf1bd","Type":"ContainerDied","Data":"fbabf9ae78d3845786a8744b3c142340eedb71f2070721d38143b8062cc47ba1"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.566203 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.586516 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" event={"ID":"95f6c15c-5422-4ef6-a4a8-108959afcae6","Type":"ContainerStarted","Data":"4f28c2583e895e06382a45f75416d5f565ffa1c0179c44359a6c5678d0ded47c"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.609522 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-842hf" podStartSLOduration=21.588448956 podStartE2EDuration="21.588448956s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-11 03:55:56.585908318 +0000 UTC m=+47.796390240" watchObservedRunningTime="2025-10-11 03:55:56.588448956 +0000 UTC m=+47.798930878" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.612923 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2jlb9" event={"ID":"4e2ba5fe-e94f-4d40-8596-d4b9ad863215","Type":"ContainerStarted","Data":"fe022d5d09a897d1a866c53c1bf3e8ecbec4b025a84927ca34b620217979d5b0"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.627166 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" event={"ID":"2c0331e0-7b42-4227-8d50-d4c66a927dea","Type":"ContainerStarted","Data":"012c7e4c177d7569dd7c84b8a0a92afab29c79a6f0bbe78f482f9d7f7dfd7873"} Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.638191 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.638596 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.138580709 +0000 UTC m=+48.349062631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.659861 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xh9xx" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.667117 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dr9rm" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.667942 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" podStartSLOduration=21.667919819 podStartE2EDuration="21.667919819s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.667538589 +0000 UTC m=+47.878020521" watchObservedRunningTime="2025-10-11 03:55:56.667919819 +0000 UTC m=+47.878401741" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.696839 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" podStartSLOduration=20.696825568 podStartE2EDuration="20.696825568s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.694225368 +0000 UTC m=+47.904707290" 
watchObservedRunningTime="2025-10-11 03:55:56.696825568 +0000 UTC m=+47.907307490" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.726437 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.727802 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.731972 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.740200 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.740470 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69q9q\" (UniqueName: \"kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.740540 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.240514029 +0000 UTC m=+48.450995951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.741768 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.741909 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.742214 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.746692 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-11 03:55:57.246665912 +0000 UTC m=+48.457147834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.748384 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dfgkf" podStartSLOduration=21.748359338 podStartE2EDuration="21.748359338s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.732397963 +0000 UTC m=+47.942879875" watchObservedRunningTime="2025-10-11 03:55:56.748359338 +0000 UTC m=+47.958841260" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.751047 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845287 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845607 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities\") pod 
\"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845651 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845693 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845745 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69q9q\" (UniqueName: \"kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845772 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z74n\" (UniqueName: \"kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.845799 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.845949 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.345927811 +0000 UTC m=+48.556409733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.846289 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.846519 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.894839 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-58897d9998-vmhjk" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.947519 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.947963 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z74n\" (UniqueName: \"kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.948014 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.948048 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.948375 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content\") pod \"certified-operators-z4mh8\" 
(UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: E1011 03:55:56.948522 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.448506588 +0000 UTC m=+48.658988500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.948905 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.950337 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69q9q\" (UniqueName: \"kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q\") pod \"community-operators-g6lsz\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.982716 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b5vqb" podStartSLOduration=20.982700167 
podStartE2EDuration="20.982700167s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:56.93730112 +0000 UTC m=+48.147783052" watchObservedRunningTime="2025-10-11 03:55:56.982700167 +0000 UTC m=+48.193182089" Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.984019 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:55:56 crc kubenswrapper[4703]: I1011 03:55:56.985097 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.050847 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.051174 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.551149927 +0000 UTC m=+48.761631849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.079351 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.101664 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z74n\" (UniqueName: \"kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n\") pod \"certified-operators-z4mh8\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.128481 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.152322 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4rrv\" (UniqueName: \"kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.152415 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.152442 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.152509 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.152870 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.6528542 +0000 UTC m=+48.863336122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.183309 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.185402 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.189712 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.253855 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.254045 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.254130 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rrv\" (UniqueName: \"kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.254180 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.254585 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " 
pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.254653 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.754637416 +0000 UTC m=+48.965119338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.254846 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.265064 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2jlb9" podStartSLOduration=8.265045323 podStartE2EDuration="8.265045323s" podCreationTimestamp="2025-10-11 03:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:57.263911843 +0000 UTC m=+48.474393765" watchObservedRunningTime="2025-10-11 03:55:57.265045323 +0000 UTC m=+48.475527245" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.280817 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.319568 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rrv\" (UniqueName: \"kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv\") pod \"community-operators-xxjnd\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.332341 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.355160 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.355227 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.355274 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.355308 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnblj\" (UniqueName: \"kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.355609 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.85559715 +0000 UTC m=+49.066079072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.456034 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.456205 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnblj\" (UniqueName: \"kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc 
kubenswrapper[4703]: I1011 03:55:57.456247 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.456291 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.456737 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.456814 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:57.95679877 +0000 UTC m=+49.167280682 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.457230 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.457619 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8wrsq" podStartSLOduration=21.457603702 podStartE2EDuration="21.457603702s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:57.387772995 +0000 UTC m=+48.598254917" watchObservedRunningTime="2025-10-11 03:55:57.457603702 +0000 UTC m=+48.668085624" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.472627 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:55:57 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:55:57 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:55:57 crc kubenswrapper[4703]: healthz check failed Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 
03:55:57.472985 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.513249 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnblj\" (UniqueName: \"kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj\") pod \"certified-operators-wm7q9\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.533279 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.558126 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.558535 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.058521635 +0000 UTC m=+49.269003557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.660054 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.660507 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.160491536 +0000 UTC m=+49.370973458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.691742 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hrr65" event={"ID":"5938c6ea-4d1a-4ca7-bbc3-0bf2b1e9e6bb","Type":"ContainerStarted","Data":"cb60d5286a71efc74ef7317db6517b77fb89e1624a6ea0f14c274624617e6a05"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.733363 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h9vv2" event={"ID":"be037531-a087-4b3e-9327-f28d38fd3961","Type":"ContainerStarted","Data":"0e680a4475b6d6bcb8d4b8bdee26be2c0b7dc07ad80c94e9bcd4e30c5df0ea37"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.764157 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.764624 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.264608423 +0000 UTC m=+49.475090345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.776680 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s5kf" event={"ID":"8c8bf83f-ab19-43ea-b331-db24e4e97709","Type":"ContainerStarted","Data":"58dbd6379c0ae7049ad98b006c00a93f7a373a4b30fe5f360f6fb935026cbe34"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.776730 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s5kf" event={"ID":"8c8bf83f-ab19-43ea-b331-db24e4e97709","Type":"ContainerStarted","Data":"b2d18a9ed77d1b1cb230e30bf636a03b33eff9b92147bb67d9270e558776c287"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.813743 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-728m8" event={"ID":"59fc390f-24d6-4f16-855b-475f50c6f4b0","Type":"ContainerStarted","Data":"3f56245240db969d168b6e19007889fe6f131ec7c8216245ce31a73027d16b33"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.861412 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" event={"ID":"9b135157-3c40-4f85-84a4-5c521ccbf1bd","Type":"ContainerStarted","Data":"7054519ec1506c225666d1a2f1217f96afc3484df299eaf3e3e7454bb433037f"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.861508 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" 
event={"ID":"9b135157-3c40-4f85-84a4-5c521ccbf1bd","Type":"ContainerStarted","Data":"75d86f49360fd9d2521854bd4c2a6c7ab8560f41c26c347e2903a51e361e9fd9"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.865451 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.866695 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.366663506 +0000 UTC m=+49.577145428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.908756 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4s5kf" podStartSLOduration=22.908740195 podStartE2EDuration="22.908740195s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:57.804706039 +0000 UTC m=+49.015187951" watchObservedRunningTime="2025-10-11 03:55:57.908740195 +0000 UTC m=+49.119222117" Oct 11 03:55:57 crc 
kubenswrapper[4703]: I1011 03:55:57.909051 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" event={"ID":"cffc1277-9912-47dd-80a3-d724149e420c","Type":"ContainerStarted","Data":"f933aaedee85eb8f6ad14b122d845f87633ead7116829e5bfee4d6eb8ad694d4"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.944979 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rf7jf" event={"ID":"5b7cae24-659a-4bda-b925-d74698801dd9","Type":"ContainerStarted","Data":"1ec942928a48a5b2f446d7f664397d96b79d6e10fc5645a46c0096d8aa09bbc0"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.945742 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rf7jf" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.960164 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" podStartSLOduration=22.960145341 podStartE2EDuration="22.960145341s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:57.910146182 +0000 UTC m=+49.120628104" watchObservedRunningTime="2025-10-11 03:55:57.960145341 +0000 UTC m=+49.170627263" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.962640 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" podStartSLOduration=21.962632498 podStartE2EDuration="21.962632498s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:57.961000275 +0000 UTC m=+49.171482197" watchObservedRunningTime="2025-10-11 03:55:57.962632498 +0000 UTC m=+49.173114420" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 
03:55:57.967056 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.989204 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" event={"ID":"34d64762-444d-41a0-bf17-d8de6712a90f","Type":"ContainerStarted","Data":"8287279b7d1f04a5ca2edee378d9fa7ccaeff2b40546283ccd0579eb69e21a47"} Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.989811 4703 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tjd96 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Oct 11 03:55:57 crc kubenswrapper[4703]: I1011 03:55:57.989872 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Oct 11 03:55:57 crc kubenswrapper[4703]: E1011 03:55:57.989976 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.489958604 +0000 UTC m=+49.700440526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.009402 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" gracePeriod=30 Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.086622 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.088980 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.588963886 +0000 UTC m=+49.799445808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.096659 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.103514 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.603500102 +0000 UTC m=+49.813982024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.137399 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4v67m" podStartSLOduration=22.137380054 podStartE2EDuration="22.137380054s" podCreationTimestamp="2025-10-11 03:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:58.135882343 +0000 UTC m=+49.346364265" watchObservedRunningTime="2025-10-11 03:55:58.137380054 +0000 UTC m=+49.347861976" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.138697 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rf7jf" podStartSLOduration=9.138691277 podStartE2EDuration="9.138691277s" podCreationTimestamp="2025-10-11 03:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:55:58.021671257 +0000 UTC m=+49.232153169" watchObservedRunningTime="2025-10-11 03:55:58.138691277 +0000 UTC m=+49.349173189" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.186333 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.186488 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 
03:55:58.201723 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.202305 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.702289329 +0000 UTC m=+49.912771251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.204060 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.225755 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.278325 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.305936 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.306291 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.806280463 +0000 UTC m=+50.016762385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.332334 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:55:58 crc kubenswrapper[4703]: W1011 03:55:58.379771 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27051689_7c44_4841_bcbc_7118a66ae0a0.slice/crio-3d332cac8f64275427f261d7221179b27e911f573168660eebb57a54980025a7 WatchSource:0}: Error finding container 3d332cac8f64275427f261d7221179b27e911f573168660eebb57a54980025a7: Status 404 returned error can't find the container with id 3d332cac8f64275427f261d7221179b27e911f573168660eebb57a54980025a7 Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.408289 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.408604 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.408865 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:58.908849269 +0000 UTC m=+50.119331191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.459338 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.463682 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:55:58 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:55:58 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:55:58 crc kubenswrapper[4703]: healthz check failed Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.463774 4703 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.511061 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.511389 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.011379116 +0000 UTC m=+50.221861038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.612273 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.613188 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.113170581 +0000 UTC m=+50.323652503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.679394 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vmsw5" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.714690 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.715035 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.215023849 +0000 UTC m=+50.425505771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.816025 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.816307 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.31628234 +0000 UTC m=+50.526764262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.917848 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:58 crc kubenswrapper[4703]: E1011 03:55:58.918259 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.418240431 +0000 UTC m=+50.628722423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.926712 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.927653 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.930799 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.938073 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.975195 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.975833 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.983722 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.984284 4703 generic.go:334] "Generic (PLEG): container finished" podID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerID="2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a" exitCode=0 Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.985205 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerDied","Data":"2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a"} Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.985236 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerStarted","Data":"3d332cac8f64275427f261d7221179b27e911f573168660eebb57a54980025a7"} Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.987974 4703 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 11 03:55:58 crc kubenswrapper[4703]: I1011 03:55:58.988613 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.000713 4703 generic.go:334] "Generic (PLEG): container finished" podID="9301ceea-042d-4d8d-941c-54032cf80047" containerID="b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642" exitCode=0 Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.000807 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerDied","Data":"b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.001048 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerStarted","Data":"0f28f57e7dd1a444ac576b47c3ed1c53ba3015f4b48f8487a9951ddc0db1e252"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.001860 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.009481 4703 generic.go:334] "Generic (PLEG): container finished" podID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerID="aac92bdff7b7700a95e190c20008ea30efcd1030892d0aa709dd83d7cd99ba2a" exitCode=0 Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.009528 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerDied","Data":"aac92bdff7b7700a95e190c20008ea30efcd1030892d0aa709dd83d7cd99ba2a"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.009562 
4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerStarted","Data":"ec62bf50d68ec1b29268cd2d88307bdd59083339b1c0586ff77eee146f9b74f3"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.020284 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.020717 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.520689435 +0000 UTC m=+50.731171357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.020859 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.020953 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.021014 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.021121 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.021191 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nx5\" (UniqueName: \"kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.021246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.022390 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.52238258 +0000 UTC m=+50.732864502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.036757 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-728m8" event={"ID":"59fc390f-24d6-4f16-855b-475f50c6f4b0","Type":"ContainerStarted","Data":"cb7a123c46d7673501f22f702c32c0f535c2925fc816c655e20d41be3548b1b0"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.074910 4703 generic.go:334] "Generic (PLEG): container finished" podID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerID="d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1" exitCode=0 Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.075985 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerDied","Data":"d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.076022 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerStarted","Data":"74d26f6e65937ff776b7ffd464f56be7a95c0accab4d0fc032d4d5a1a23d5fbd"} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.127096 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.127559 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.627540256 +0000 UTC m=+50.838022178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.127862 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.127968 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.128002 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.128237 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.128314 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nx5\" (UniqueName: \"kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.128387 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.133864 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.134332 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.634319896 +0000 UTC m=+50.844801818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.135857 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.136827 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.172003 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nx5\" (UniqueName: \"kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5\") pod \"redhat-marketplace-qnqtn\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.181437 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access\") 
pod \"revision-pruner-9-crc\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.229768 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.230167 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.730151323 +0000 UTC m=+50.940633245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.245159 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.297688 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.318102 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.319170 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.328588 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.333160 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.333512 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.833499731 +0000 UTC m=+51.043981653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.336617 4703 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.434109 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.434312 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2n2\" (UniqueName: \"kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.434378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.434412 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.435813 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:55:59.93577916 +0000 UTC m=+51.146261082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.477314 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:55:59 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:55:59 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:55:59 crc kubenswrapper[4703]: healthz check failed Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.477370 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.535660 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2n2\" (UniqueName: \"kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.536102 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.536142 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.536170 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.536428 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-11 03:56:00.036417125 +0000 UTC m=+51.246899047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.537105 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.537331 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.578447 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2n2\" (UniqueName: \"kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2\") pod \"redhat-marketplace-vvrdk\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.589695 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.638351 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.638680 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.138655733 +0000 UTC m=+51.349137655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.640341 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.640998 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 03:56:00.140990074 +0000 UTC m=+51.351471996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q44w9" (UID: "92095718-20ba-4b03-b949-e9a26009e283") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.641248 4703 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-11T03:55:59.33686948Z","Handler":null,"Name":""} Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.647084 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.647642 4703 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.647688 4703 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.652147 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 03:55:59 crc kubenswrapper[4703]: W1011 03:55:59.676040 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod86958692_3361_419f_87ef_9b4fe71f9d62.slice/crio-0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541 WatchSource:0}: Error finding container 0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541: 
Status 404 returned error can't find the container with id 0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541 Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.727666 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.733769 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.742358 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.743219 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.750912 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.752665 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.847572 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwqgx\" (UniqueName: \"kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.881267 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.881379 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.882055 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.900007 4703 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.900056 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.937270 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.938817 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:55:59 crc kubenswrapper[4703]: E1011 03:55:59.939050 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9307554f_a116_4c31_9585_c712558479fc.slice/crio-1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9307554f_a116_4c31_9585_c712558479fc.slice/crio-conmon-1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241.scope\": RecentStats: unable to find data in memory cache]" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.947811 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.985629 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.985861 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.985996 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwqgx\" (UniqueName: \"kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:55:59 crc kubenswrapper[4703]: I1011 03:55:59.995434 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:55:59.986983 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.026539 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwqgx\" (UniqueName: 
\"kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx\") pod \"redhat-operators-n5sn7\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.044407 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.064712 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q44w9\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.073822 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.087601 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7pb\" (UniqueName: \"kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.087985 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.088156 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.135516 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.143366 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86958692-3361-419f-87ef-9b4fe71f9d62","Type":"ContainerStarted","Data":"0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.164822 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerStarted","Data":"9b1977fa31bb3f56043fd0de3e1541bfc7189b16e226297b642ff1fd43381ecd"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.167397 4703 generic.go:334] "Generic (PLEG): container finished" podID="9307554f-a116-4c31-9585-c712558479fc" containerID="1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241" exitCode=0 Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.167530 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerDied","Data":"1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.167554 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" 
event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerStarted","Data":"02d38393516d074048165b50fc80e058f6037a52304aea6e369644d4bbc6cfbe"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.186570 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-728m8" event={"ID":"59fc390f-24d6-4f16-855b-475f50c6f4b0","Type":"ContainerStarted","Data":"6cd6bebefa26473ed2c46eabd5a38a5b1a072a25b24a8fc4acd0bf3cab9831ea"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.186680 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-728m8" event={"ID":"59fc390f-24d6-4f16-855b-475f50c6f4b0","Type":"ContainerStarted","Data":"aa92908d6e1989f117c3dabab15859a48b3f34eb8152222dd11d43086e30c340"} Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189224 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189273 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189297 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7pb\" (UniqueName: \"kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb\") pod 
\"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189318 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189341 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189396 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.189876 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.190704 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.190981 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.200678 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.203034 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.210859 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7pb\" (UniqueName: \"kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb\") pod \"redhat-operators-v74rb\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.224244 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-728m8" podStartSLOduration=11.22422854 
podStartE2EDuration="11.22422854s" podCreationTimestamp="2025-10-11 03:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:56:00.223354636 +0000 UTC m=+51.433836568" watchObservedRunningTime="2025-10-11 03:56:00.22422854 +0000 UTC m=+51.434710462" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.320678 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.392689 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.400560 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.428785 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.456390 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:00 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:00 crc kubenswrapper[4703]: [+]process-running ok 
Oct 11 03:56:00 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.456434 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.456963 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 03:56:00 crc kubenswrapper[4703]: W1011 03:56:00.460600 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85389b5_5e4e_4939_8f96_902c5a3ed9e2.slice/crio-f21f8d8a91f7e64a365ba63004c6223aba654e8e41fd0121a5ed8c0706b7547f WatchSource:0}: Error finding container f21f8d8a91f7e64a365ba63004c6223aba654e8e41fd0121a5ed8c0706b7547f: Status 404 returned error can't find the container with id f21f8d8a91f7e64a365ba63004c6223aba654e8e41fd0121a5ed8c0706b7547f Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.469560 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.484608 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.750200 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.857107 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-dnf85 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.857568 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dnf85" podUID="f5fe968e-3846-4f87-a7b9-4fdb9d945eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.857134 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-dnf85 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.857904 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dnf85" podUID="f5fe968e-3846-4f87-a7b9-4fdb9d945eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Oct 11 03:56:00 crc kubenswrapper[4703]: I1011 03:56:00.894012 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:56:00 crc kubenswrapper[4703]: W1011 03:56:00.928572 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa264969_12b7_415a_a569_0f2cab5d3a30.slice/crio-2e1bd993e04cf8bfa6730bf30d15c5b8282d3bcc0a0cbd8ff1923150a7a9ef69 WatchSource:0}: Error finding container 2e1bd993e04cf8bfa6730bf30d15c5b8282d3bcc0a0cbd8ff1923150a7a9ef69: Status 404 returned error can't find the container with id 2e1bd993e04cf8bfa6730bf30d15c5b8282d3bcc0a0cbd8ff1923150a7a9ef69 Oct 11 03:56:00 crc kubenswrapper[4703]: W1011 03:56:00.932928 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-91d65756fc083dde2d772d93e2fdaf9bf1a02c494b6a49fab3da1f70d9d769b5 WatchSource:0}: Error finding container 91d65756fc083dde2d772d93e2fdaf9bf1a02c494b6a49fab3da1f70d9d769b5: Status 404 returned error can't find the container with id 91d65756fc083dde2d772d93e2fdaf9bf1a02c494b6a49fab3da1f70d9d769b5 Oct 11 03:56:01 crc kubenswrapper[4703]: W1011 03:56:01.008612 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-52331d83e214888fce38f4c3fb8d49af07dfe5bf5cfc39198299054523de279f WatchSource:0}: Error finding container 52331d83e214888fce38f4c3fb8d49af07dfe5bf5cfc39198299054523de279f: Status 404 returned error can't find the container with id 52331d83e214888fce38f4c3fb8d49af07dfe5bf5cfc39198299054523de279f Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.220737 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerStarted","Data":"2e1bd993e04cf8bfa6730bf30d15c5b8282d3bcc0a0cbd8ff1923150a7a9ef69"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.230054 4703 generic.go:334] "Generic (PLEG): container finished" podID="86958692-3361-419f-87ef-9b4fe71f9d62" 
containerID="2a971c599126effd755cace9f91bb20eb438ede32154aece39f0cee0fd4ad249" exitCode=0 Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.230152 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86958692-3361-419f-87ef-9b4fe71f9d62","Type":"ContainerDied","Data":"2a971c599126effd755cace9f91bb20eb438ede32154aece39f0cee0fd4ad249"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.235187 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"91d65756fc083dde2d772d93e2fdaf9bf1a02c494b6a49fab3da1f70d9d769b5"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.238063 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" event={"ID":"92095718-20ba-4b03-b949-e9a26009e283","Type":"ContainerStarted","Data":"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.238092 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" event={"ID":"92095718-20ba-4b03-b949-e9a26009e283","Type":"ContainerStarted","Data":"e758d79d1a62e78e6af72ad4ce6eeca1a5922845895f24b985711e2f6bf0a285"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.238375 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.240683 4703 generic.go:334] "Generic (PLEG): container finished" podID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerID="4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5" exitCode=0 Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.240869 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerDied","Data":"4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.245109 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"52331d83e214888fce38f4c3fb8d49af07dfe5bf5cfc39198299054523de279f"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.247214 4703 generic.go:334] "Generic (PLEG): container finished" podID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerID="996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c" exitCode=0 Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.247435 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerDied","Data":"996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.247550 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerStarted","Data":"f21f8d8a91f7e64a365ba63004c6223aba654e8e41fd0121a5ed8c0706b7547f"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.250381 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5e71b501690b52a41f0f626992027669b960105fd6ae1755a859001262291cdf"} Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.306764 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" 
podStartSLOduration=26.306741287 podStartE2EDuration="26.306741287s" podCreationTimestamp="2025-10-11 03:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:56:01.29368759 +0000 UTC m=+52.504169512" watchObservedRunningTime="2025-10-11 03:56:01.306741287 +0000 UTC m=+52.517223209" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.456391 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:01 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:01 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:01 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.456441 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.546685 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.735946 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.736695 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.737951 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.742484 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.742671 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.829864 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.830197 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.888263 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.888315 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.890442 4703 patch_prober.go:28] interesting pod/console-f9d7485db-m5qlx container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.890508 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-m5qlx" podUID="34cb015e-a7ba-4cea-a2fc-ab3b101299ce" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.913559 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.913621 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.926862 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.940641 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.940733 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.940798 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: E1011 03:56:01.947482 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:01 crc kubenswrapper[4703]: E1011 03:56:01.949933 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:01 crc kubenswrapper[4703]: I1011 03:56:01.962582 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:01 crc kubenswrapper[4703]: E1011 03:56:01.967015 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:01 crc kubenswrapper[4703]: E1011 03:56:01.967061 4703 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.015370 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.015438 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.048533 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.063446 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.261121 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"91578788778cba5e740db9395ea01154324df633f15655fcd28bc9ea5115a11c"} Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.271756 4703 generic.go:334] "Generic (PLEG): container finished" podID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerID="4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9" exitCode=0 Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.271818 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerDied","Data":"4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9"} Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.276817 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3d70280e3bee22e7dc34904671364d933da856c4bf95c712ac3992907f8705fc"} Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.284773 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"293c21bf66b12e30e4106c433d94750b04798ceb17f0cb46a212687c842b4e30"} Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.284815 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.290397 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pb92d" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.291287 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4fv46" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.454307 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.468632 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:02 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:02 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:02 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.468688 4703 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.476718 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.529705 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.877667 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.961593 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access\") pod \"86958692-3361-419f-87ef-9b4fe71f9d62\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.961644 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir\") pod \"86958692-3361-419f-87ef-9b4fe71f9d62\" (UID: \"86958692-3361-419f-87ef-9b4fe71f9d62\") " Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.961794 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "86958692-3361-419f-87ef-9b4fe71f9d62" (UID: "86958692-3361-419f-87ef-9b4fe71f9d62"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:56:02 crc kubenswrapper[4703]: I1011 03:56:02.967708 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "86958692-3361-419f-87ef-9b4fe71f9d62" (UID: "86958692-3361-419f-87ef-9b4fe71f9d62"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.062902 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86958692-3361-419f-87ef-9b4fe71f9d62-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.062935 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86958692-3361-419f-87ef-9b4fe71f9d62-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.319458 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1809da0d-af06-47ca-bab2-d26183265884","Type":"ContainerStarted","Data":"6bd46c9a3b658c34f9edd1a9393e8deb68a018ff4ea920088b66195bddf955f3"} Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.325000 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.325993 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86958692-3361-419f-87ef-9b4fe71f9d62","Type":"ContainerDied","Data":"0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541"} Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.326069 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0533be961534cfd0692c5caa45eb3bf6fb322a70a598d7385800488e30135541" Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.456132 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:03 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:03 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:03 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:03 crc kubenswrapper[4703]: I1011 03:56:03.456193 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.340694 4703 generic.go:334] "Generic (PLEG): container finished" podID="1809da0d-af06-47ca-bab2-d26183265884" containerID="67e9d27c8872523fe546cc3688006652e837ac0caf5838eb5881fcfb80517d89" exitCode=0 Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.341784 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"1809da0d-af06-47ca-bab2-d26183265884","Type":"ContainerDied","Data":"67e9d27c8872523fe546cc3688006652e837ac0caf5838eb5881fcfb80517d89"} Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.365306 4703 generic.go:334] "Generic (PLEG): container finished" podID="83320069-4d0c-4e40-8a7f-b3bc4beda7a1" containerID="2eb9f8e6d465e4ccdd205ae0aeeca09ce9e2af9ecf85617551218ef952761320" exitCode=0 Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.365406 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" event={"ID":"83320069-4d0c-4e40-8a7f-b3bc4beda7a1","Type":"ContainerDied","Data":"2eb9f8e6d465e4ccdd205ae0aeeca09ce9e2af9ecf85617551218ef952761320"} Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.454991 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:04 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:04 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:04 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:04 crc kubenswrapper[4703]: I1011 03:56:04.455040 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:05 crc kubenswrapper[4703]: I1011 03:56:05.165272 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 03:56:05 crc kubenswrapper[4703]: I1011 03:56:05.195965 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 11 03:56:05 crc kubenswrapper[4703]: I1011 
03:56:05.464246 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:05 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:05 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:05 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:05 crc kubenswrapper[4703]: I1011 03:56:05.464320 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:06 crc kubenswrapper[4703]: I1011 03:56:06.455316 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:06 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:06 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:06 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:06 crc kubenswrapper[4703]: I1011 03:56:06.455692 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:07 crc kubenswrapper[4703]: I1011 03:56:07.262137 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rf7jf" Oct 11 03:56:07 crc kubenswrapper[4703]: I1011 03:56:07.312831 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
podStartSLOduration=2.312813253 podStartE2EDuration="2.312813253s" podCreationTimestamp="2025-10-11 03:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:56:07.312315339 +0000 UTC m=+58.522797261" watchObservedRunningTime="2025-10-11 03:56:07.312813253 +0000 UTC m=+58.523295175" Oct 11 03:56:07 crc kubenswrapper[4703]: I1011 03:56:07.455272 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:07 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:07 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:07 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:07 crc kubenswrapper[4703]: I1011 03:56:07.455416 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:08 crc kubenswrapper[4703]: I1011 03:56:08.456032 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:08 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:08 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:08 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:08 crc kubenswrapper[4703]: I1011 03:56:08.456104 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 11 03:56:09 crc kubenswrapper[4703]: I1011 03:56:09.459212 4703 patch_prober.go:28] interesting pod/router-default-5444994796-mlb5q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 03:56:09 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Oct 11 03:56:09 crc kubenswrapper[4703]: [+]process-running ok Oct 11 03:56:09 crc kubenswrapper[4703]: healthz check failed Oct 11 03:56:09 crc kubenswrapper[4703]: I1011 03:56:09.459663 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mlb5q" podUID="3516a340-0474-422a-bc3c-4424e08c17b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 03:56:10 crc kubenswrapper[4703]: I1011 03:56:10.455860 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:56:10 crc kubenswrapper[4703]: I1011 03:56:10.458473 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mlb5q" Oct 11 03:56:10 crc kubenswrapper[4703]: I1011 03:56:10.860187 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dnf85" Oct 11 03:56:11 crc kubenswrapper[4703]: I1011 03:56:11.887692 4703 patch_prober.go:28] interesting pod/console-f9d7485db-m5qlx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 11 03:56:11 crc kubenswrapper[4703]: I1011 03:56:11.887768 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-m5qlx" podUID="34cb015e-a7ba-4cea-a2fc-ab3b101299ce" 
containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 11 03:56:11 crc kubenswrapper[4703]: E1011 03:56:11.940990 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:11 crc kubenswrapper[4703]: E1011 03:56:11.942774 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:11 crc kubenswrapper[4703]: E1011 03:56:11.945538 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:11 crc kubenswrapper[4703]: E1011 03:56:11.945570 4703 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:56:14 crc kubenswrapper[4703]: I1011 03:56:14.549122 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.357579 4703 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.361453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir\") pod \"1809da0d-af06-47ca-bab2-d26183265884\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.361522 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1809da0d-af06-47ca-bab2-d26183265884" (UID: "1809da0d-af06-47ca-bab2-d26183265884"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.361742 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1809da0d-af06-47ca-bab2-d26183265884-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.363496 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.373681 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.373654676 podStartE2EDuration="5.373654676s" podCreationTimestamp="2025-10-11 03:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:56:19.37269985 +0000 UTC m=+70.583181772" watchObservedRunningTime="2025-10-11 03:56:19.373654676 +0000 UTC m=+70.584136638" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.462410 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbzsb\" (UniqueName: \"kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb\") pod \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.462536 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access\") pod \"1809da0d-af06-47ca-bab2-d26183265884\" (UID: \"1809da0d-af06-47ca-bab2-d26183265884\") " Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.462567 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume\") pod \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.462605 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume\") pod \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\" (UID: \"83320069-4d0c-4e40-8a7f-b3bc4beda7a1\") " Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.464917 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume" (OuterVolumeSpecName: "config-volume") pod "83320069-4d0c-4e40-8a7f-b3bc4beda7a1" (UID: "83320069-4d0c-4e40-8a7f-b3bc4beda7a1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.474914 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1809da0d-af06-47ca-bab2-d26183265884" (UID: "1809da0d-af06-47ca-bab2-d26183265884"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.475356 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "83320069-4d0c-4e40-8a7f-b3bc4beda7a1" (UID: "83320069-4d0c-4e40-8a7f-b3bc4beda7a1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.475542 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb" (OuterVolumeSpecName: "kube-api-access-tbzsb") pod "83320069-4d0c-4e40-8a7f-b3bc4beda7a1" (UID: "83320069-4d0c-4e40-8a7f-b3bc4beda7a1"). InnerVolumeSpecName "kube-api-access-tbzsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.481184 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1809da0d-af06-47ca-bab2-d26183265884","Type":"ContainerDied","Data":"6bd46c9a3b658c34f9edd1a9393e8deb68a018ff4ea920088b66195bddf955f3"} Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.481239 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bd46c9a3b658c34f9edd1a9393e8deb68a018ff4ea920088b66195bddf955f3" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.481322 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.487414 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" event={"ID":"83320069-4d0c-4e40-8a7f-b3bc4beda7a1","Type":"ContainerDied","Data":"594a35444e4d33b415b7636780710247c44c4e2ad7c3ef364c4cde3e44ebdcf0"} Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.487455 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="594a35444e4d33b415b7636780710247c44c4e2ad7c3ef364c4cde3e44ebdcf0" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.487553 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335905-4jslx" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.565338 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbzsb\" (UniqueName: \"kubernetes.io/projected/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-kube-api-access-tbzsb\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.565655 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1809da0d-af06-47ca-bab2-d26183265884-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.565669 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:19 crc kubenswrapper[4703]: I1011 03:56:19.565684 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83320069-4d0c-4e40-8a7f-b3bc4beda7a1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:20 crc kubenswrapper[4703]: I1011 03:56:20.142881 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 03:56:21 crc kubenswrapper[4703]: E1011 03:56:21.942604 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:21 crc kubenswrapper[4703]: E1011 03:56:21.945258 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:21 crc kubenswrapper[4703]: E1011 03:56:21.947941 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 11 03:56:21 crc kubenswrapper[4703]: E1011 03:56:21.948017 4703 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:56:21 crc kubenswrapper[4703]: I1011 03:56:21.969908 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:56:21 crc kubenswrapper[4703]: I1011 03:56:21.974955 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-m5qlx" Oct 11 03:56:25 crc kubenswrapper[4703]: E1011 03:56:25.984180 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 11 03:56:25 crc kubenswrapper[4703]: E1011 03:56:25.985237 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss7pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v74rb_openshift-marketplace(fa264969-12b7-415a-a569-0f2cab5d3a30): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 11 03:56:25 crc kubenswrapper[4703]: E1011 03:56:25.986343 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v74rb" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" Oct 11 03:56:25 crc 
kubenswrapper[4703]: E1011 03:56:25.998627 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 11 03:56:25 crc kubenswrapper[4703]: E1011 03:56:25.998864 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwqgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-n5sn7_openshift-marketplace(d85389b5-5e4e-4939-8f96-902c5a3ed9e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 11 03:56:26 crc kubenswrapper[4703]: E1011 03:56:26.000064 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n5sn7" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.372396 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n5sn7" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.372668 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v74rb" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.484419 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.484584 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnblj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wm7q9_openshift-marketplace(7dfc7ef5-c10c-4282-9857-ad5a2fb6e356): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.486240 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wm7q9" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.501818 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.501945 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6z74n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-z4mh8_openshift-marketplace(9301ceea-042d-4d8d-941c-54032cf80047): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.503104 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-z4mh8" podUID="9301ceea-042d-4d8d-941c-54032cf80047" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.537100 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wm7q9" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" Oct 11 03:56:27 crc kubenswrapper[4703]: E1011 03:56:27.537681 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-z4mh8" podUID="9301ceea-042d-4d8d-941c-54032cf80047" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.450403 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-69n87_01618421-523c-44ab-b631-f2624ba8ab2d/kube-multus-additional-cni-plugins/0.log" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.450745 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.541656 4703 generic.go:334] "Generic (PLEG): container finished" podID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerID="589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341" exitCode=0 Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.541750 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerDied","Data":"589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.577550 4703 generic.go:334] "Generic (PLEG): container finished" podID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerID="ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f" exitCode=0 Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.577708 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerDied","Data":"ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.584091 4703 generic.go:334] "Generic (PLEG): container finished" podID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerID="bbd626fc94482026701234d9e79d2a982cc3e1068a21d3a57552dd924b217d9b" exitCode=0 Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.584440 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerDied","Data":"bbd626fc94482026701234d9e79d2a982cc3e1068a21d3a57552dd924b217d9b"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.591062 4703 generic.go:334] "Generic (PLEG): container finished" podID="9307554f-a116-4c31-9585-c712558479fc" 
containerID="28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce" exitCode=0 Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.591179 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerDied","Data":"28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597338 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc6bj\" (UniqueName: \"kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj\") pod \"01618421-523c-44ab-b631-f2624ba8ab2d\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597400 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir\") pod \"01618421-523c-44ab-b631-f2624ba8ab2d\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist\") pod \"01618421-523c-44ab-b631-f2624ba8ab2d\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597548 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready\") pod \"01618421-523c-44ab-b631-f2624ba8ab2d\" (UID: \"01618421-523c-44ab-b631-f2624ba8ab2d\") " Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597841 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-69n87_01618421-523c-44ab-b631-f2624ba8ab2d/kube-multus-additional-cni-plugins/0.log" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597916 4703 generic.go:334] "Generic (PLEG): container finished" podID="01618421-523c-44ab-b631-f2624ba8ab2d" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" exitCode=137 Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.597971 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" event={"ID":"01618421-523c-44ab-b631-f2624ba8ab2d","Type":"ContainerDied","Data":"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.598021 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" event={"ID":"01618421-523c-44ab-b631-f2624ba8ab2d","Type":"ContainerDied","Data":"2a712460999be14f5cfe2b01b79b3090e05fa72aaeb2260e094e7e3f2016bb01"} Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.598079 4703 scope.go:117] "RemoveContainer" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.598373 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-69n87" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.598642 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "01618421-523c-44ab-b631-f2624ba8ab2d" (UID: "01618421-523c-44ab-b631-f2624ba8ab2d"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.598665 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready" (OuterVolumeSpecName: "ready") pod "01618421-523c-44ab-b631-f2624ba8ab2d" (UID: "01618421-523c-44ab-b631-f2624ba8ab2d"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.600119 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "01618421-523c-44ab-b631-f2624ba8ab2d" (UID: "01618421-523c-44ab-b631-f2624ba8ab2d"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.615210 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj" (OuterVolumeSpecName: "kube-api-access-fc6bj") pod "01618421-523c-44ab-b631-f2624ba8ab2d" (UID: "01618421-523c-44ab-b631-f2624ba8ab2d"). InnerVolumeSpecName "kube-api-access-fc6bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.640683 4703 scope.go:117] "RemoveContainer" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" Oct 11 03:56:28 crc kubenswrapper[4703]: E1011 03:56:28.641453 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b\": container with ID starting with a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b not found: ID does not exist" containerID="a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.641532 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b"} err="failed to get container status \"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b\": rpc error: code = NotFound desc = could not find container \"a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b\": container with ID starting with a40d99f678ccb88a8e3c7876ce75ddc6fa397760b26ff87effa4a1ed05d7f35b not found: ID does not exist" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.699088 4703 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01618421-523c-44ab-b631-f2624ba8ab2d-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.701750 4703 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01618421-523c-44ab-b631-f2624ba8ab2d-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.702167 4703 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/01618421-523c-44ab-b631-f2624ba8ab2d-ready\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.702265 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc6bj\" (UniqueName: \"kubernetes.io/projected/01618421-523c-44ab-b631-f2624ba8ab2d-kube-api-access-fc6bj\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.954484 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-69n87"] Oct 11 03:56:28 crc kubenswrapper[4703]: I1011 03:56:28.960010 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-69n87"] Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.552550 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" path="/var/lib/kubelet/pods/01618421-523c-44ab-b631-f2624ba8ab2d/volumes" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.608042 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerStarted","Data":"79e7b627d80ee108b2b1d9fda2b4acb53609c34bd5a431fb94d97edfdc75e624"} Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.611285 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerStarted","Data":"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3"} Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.616238 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerStarted","Data":"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31"} Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 
03:56:29.623666 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerStarted","Data":"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb"} Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.634132 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xxjnd" podStartSLOduration=3.637154604 podStartE2EDuration="33.634107979s" podCreationTimestamp="2025-10-11 03:55:56 +0000 UTC" firstStartedPulling="2025-10-11 03:55:59.012824406 +0000 UTC m=+50.223306328" lastFinishedPulling="2025-10-11 03:56:29.009777781 +0000 UTC m=+80.220259703" observedRunningTime="2025-10-11 03:56:29.633864152 +0000 UTC m=+80.844346084" watchObservedRunningTime="2025-10-11 03:56:29.634107979 +0000 UTC m=+80.844589891" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.669558 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.669639 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.691815 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6lsz" podStartSLOduration=3.686330855 podStartE2EDuration="33.691788699s" podCreationTimestamp="2025-10-11 03:55:56 +0000 UTC" firstStartedPulling="2025-10-11 03:55:58.988276283 +0000 UTC m=+50.198758195" lastFinishedPulling="2025-10-11 03:56:28.993734117 +0000 UTC m=+80.204216039" observedRunningTime="2025-10-11 03:56:29.687024847 +0000 UTC m=+80.897506769" watchObservedRunningTime="2025-10-11 03:56:29.691788699 +0000 UTC m=+80.902270621" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.708216 4703 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vvrdk" podStartSLOduration=2.902476616 podStartE2EDuration="30.708194124s" podCreationTimestamp="2025-10-11 03:55:59 +0000 UTC" firstStartedPulling="2025-10-11 03:56:01.242414747 +0000 UTC m=+52.452896669" lastFinishedPulling="2025-10-11 03:56:29.048132255 +0000 UTC m=+80.258614177" observedRunningTime="2025-10-11 03:56:29.703818203 +0000 UTC m=+80.914300125" watchObservedRunningTime="2025-10-11 03:56:29.708194124 +0000 UTC m=+80.918676046" Oct 11 03:56:29 crc kubenswrapper[4703]: I1011 03:56:29.731653 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnqtn" podStartSLOduration=2.813096572 podStartE2EDuration="31.731634884s" podCreationTimestamp="2025-10-11 03:55:58 +0000 UTC" firstStartedPulling="2025-10-11 03:56:00.182372737 +0000 UTC m=+51.392854659" lastFinishedPulling="2025-10-11 03:56:29.100911049 +0000 UTC m=+80.311392971" observedRunningTime="2025-10-11 03:56:29.729220627 +0000 UTC m=+80.939702569" watchObservedRunningTime="2025-10-11 03:56:29.731634884 +0000 UTC m=+80.942116826" Oct 11 03:56:30 crc kubenswrapper[4703]: I1011 03:56:30.841243 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vvrdk" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="registry-server" probeResult="failure" output=< Oct 11 03:56:30 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Oct 11 03:56:30 crc kubenswrapper[4703]: > Oct 11 03:56:32 crc kubenswrapper[4703]: I1011 03:56:32.078753 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-75knh" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.187022 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.187800 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.256830 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.334062 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.334129 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.400766 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.732158 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:56:37 crc kubenswrapper[4703]: I1011 03:56:37.741099 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:38 crc kubenswrapper[4703]: I1011 03:56:38.502347 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.245854 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.245918 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.313029 
4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.691195 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xxjnd" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="registry-server" containerID="cri-o://79e7b627d80ee108b2b1d9fda2b4acb53609c34bd5a431fb94d97edfdc75e624" gracePeriod=2 Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.714290 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.760367 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:56:39 crc kubenswrapper[4703]: I1011 03:56:39.778926 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:40 crc kubenswrapper[4703]: I1011 03:56:40.492440 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 03:56:40 crc kubenswrapper[4703]: I1011 03:56:40.697489 4703 generic.go:334] "Generic (PLEG): container finished" podID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerID="79e7b627d80ee108b2b1d9fda2b4acb53609c34bd5a431fb94d97edfdc75e624" exitCode=0 Oct 11 03:56:40 crc kubenswrapper[4703]: I1011 03:56:40.697564 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerDied","Data":"79e7b627d80ee108b2b1d9fda2b4acb53609c34bd5a431fb94d97edfdc75e624"} Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.559349 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.599325 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content\") pod \"629e92e9-ec79-44b4-ab21-7b44609e75b3\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.599393 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities\") pod \"629e92e9-ec79-44b4-ab21-7b44609e75b3\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.599430 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4rrv\" (UniqueName: \"kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv\") pod \"629e92e9-ec79-44b4-ab21-7b44609e75b3\" (UID: \"629e92e9-ec79-44b4-ab21-7b44609e75b3\") " Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.600724 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities" (OuterVolumeSpecName: "utilities") pod "629e92e9-ec79-44b4-ab21-7b44609e75b3" (UID: "629e92e9-ec79-44b4-ab21-7b44609e75b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.605328 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv" (OuterVolumeSpecName: "kube-api-access-m4rrv") pod "629e92e9-ec79-44b4-ab21-7b44609e75b3" (UID: "629e92e9-ec79-44b4-ab21-7b44609e75b3"). InnerVolumeSpecName "kube-api-access-m4rrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.661261 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "629e92e9-ec79-44b4-ab21-7b44609e75b3" (UID: "629e92e9-ec79-44b4-ab21-7b44609e75b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.696964 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.700133 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.701066 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/629e92e9-ec79-44b4-ab21-7b44609e75b3-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.701133 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4rrv\" (UniqueName: \"kubernetes.io/projected/629e92e9-ec79-44b4-ab21-7b44609e75b3-kube-api-access-m4rrv\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.710619 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerStarted","Data":"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28"} Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.712824 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxjnd" 
event={"ID":"629e92e9-ec79-44b4-ab21-7b44609e75b3","Type":"ContainerDied","Data":"ec62bf50d68ec1b29268cd2d88307bdd59083339b1c0586ff77eee146f9b74f3"} Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.712957 4703 scope.go:117] "RemoveContainer" containerID="79e7b627d80ee108b2b1d9fda2b4acb53609c34bd5a431fb94d97edfdc75e624" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.712843 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxjnd" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.715825 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvrdk" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="registry-server" containerID="cri-o://ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31" gracePeriod=2 Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.716169 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerStarted","Data":"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f"} Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.749088 4703 scope.go:117] "RemoveContainer" containerID="bbd626fc94482026701234d9e79d2a982cc3e1068a21d3a57552dd924b217d9b" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.785415 4703 scope.go:117] "RemoveContainer" containerID="aac92bdff7b7700a95e190c20008ea30efcd1030892d0aa709dd83d7cd99ba2a" Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.796491 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:56:41 crc kubenswrapper[4703]: I1011 03:56:41.799980 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xxjnd"] Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.131723 4703 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.310742 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities\") pod \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.311145 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2n2\" (UniqueName: \"kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2\") pod \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.311168 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content\") pod \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\" (UID: \"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df\") " Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.312005 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities" (OuterVolumeSpecName: "utilities") pod "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" (UID: "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.319007 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2" (OuterVolumeSpecName: "kube-api-access-vr2n2") pod "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" (UID: "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df"). 
InnerVolumeSpecName "kube-api-access-vr2n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.326238 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" (UID: "fd430f8d-f9b2-4b4e-9966-5fd12e03e0df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.412096 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.412132 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2n2\" (UniqueName: \"kubernetes.io/projected/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-kube-api-access-vr2n2\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.412143 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.727400 4703 generic.go:334] "Generic (PLEG): container finished" podID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerID="79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28" exitCode=0 Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.727514 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerDied","Data":"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28"} Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.735322 4703 
generic.go:334] "Generic (PLEG): container finished" podID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerID="33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f" exitCode=0 Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.735415 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerDied","Data":"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f"} Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.741425 4703 generic.go:334] "Generic (PLEG): container finished" podID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerID="185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572" exitCode=0 Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.741531 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerDied","Data":"185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572"} Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.746010 4703 generic.go:334] "Generic (PLEG): container finished" podID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerID="ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31" exitCode=0 Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.746056 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerDied","Data":"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31"} Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.746121 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvrdk" event={"ID":"fd430f8d-f9b2-4b4e-9966-5fd12e03e0df","Type":"ContainerDied","Data":"9b1977fa31bb3f56043fd0de3e1541bfc7189b16e226297b642ff1fd43381ecd"} Oct 11 03:56:42 crc 
kubenswrapper[4703]: I1011 03:56:42.746155 4703 scope.go:117] "RemoveContainer" containerID="ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.746253 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvrdk" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.784983 4703 scope.go:117] "RemoveContainer" containerID="589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.840726 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.844278 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvrdk"] Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.848870 4703 scope.go:117] "RemoveContainer" containerID="4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.875801 4703 scope.go:117] "RemoveContainer" containerID="ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31" Oct 11 03:56:42 crc kubenswrapper[4703]: E1011 03:56:42.876378 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31\": container with ID starting with ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31 not found: ID does not exist" containerID="ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.876432 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31"} err="failed to get container status 
\"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31\": rpc error: code = NotFound desc = could not find container \"ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31\": container with ID starting with ea9190a04f18bc75279a12b9e4ed4d1b5a573bb98315babd87cd0a1c66729a31 not found: ID does not exist" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.876492 4703 scope.go:117] "RemoveContainer" containerID="589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341" Oct 11 03:56:42 crc kubenswrapper[4703]: E1011 03:56:42.877051 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341\": container with ID starting with 589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341 not found: ID does not exist" containerID="589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.877087 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341"} err="failed to get container status \"589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341\": rpc error: code = NotFound desc = could not find container \"589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341\": container with ID starting with 589ad8be7fec4de894a0fff73ae85b102a2d944c780d0244360d95ee64aad341 not found: ID does not exist" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.877133 4703 scope.go:117] "RemoveContainer" containerID="4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5" Oct 11 03:56:42 crc kubenswrapper[4703]: E1011 03:56:42.877591 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5\": container with ID starting with 4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5 not found: ID does not exist" containerID="4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5" Oct 11 03:56:42 crc kubenswrapper[4703]: I1011 03:56:42.877637 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5"} err="failed to get container status \"4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5\": rpc error: code = NotFound desc = could not find container \"4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5\": container with ID starting with 4e075e38853675e2a56ba0339dca4d8729e9e04415f7da377ccb747be235d7f5 not found: ID does not exist" Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.542364 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" path="/var/lib/kubelet/pods/629e92e9-ec79-44b4-ab21-7b44609e75b3/volumes" Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.543902 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" path="/var/lib/kubelet/pods/fd430f8d-f9b2-4b4e-9966-5fd12e03e0df/volumes" Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.765687 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerStarted","Data":"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34"} Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.769483 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" 
event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerStarted","Data":"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb"} Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.778456 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerStarted","Data":"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e"} Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.785147 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerStarted","Data":"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc"} Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.804106 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v74rb" podStartSLOduration=3.790286723 podStartE2EDuration="44.804083062s" podCreationTimestamp="2025-10-11 03:55:59 +0000 UTC" firstStartedPulling="2025-10-11 03:56:02.27395171 +0000 UTC m=+53.484433632" lastFinishedPulling="2025-10-11 03:56:43.287748049 +0000 UTC m=+94.498229971" observedRunningTime="2025-10-11 03:56:43.801590392 +0000 UTC m=+95.012072314" watchObservedRunningTime="2025-10-11 03:56:43.804083062 +0000 UTC m=+95.014564994" Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.823725 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wm7q9" podStartSLOduration=2.651779334 podStartE2EDuration="46.823699966s" podCreationTimestamp="2025-10-11 03:55:57 +0000 UTC" firstStartedPulling="2025-10-11 03:55:59.086615178 +0000 UTC m=+50.297097100" lastFinishedPulling="2025-10-11 03:56:43.25853581 +0000 UTC m=+94.469017732" observedRunningTime="2025-10-11 03:56:43.823441059 +0000 UTC m=+95.033922981" 
watchObservedRunningTime="2025-10-11 03:56:43.823699966 +0000 UTC m=+95.034181888" Oct 11 03:56:43 crc kubenswrapper[4703]: I1011 03:56:43.843222 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5sn7" podStartSLOduration=2.914016312 podStartE2EDuration="44.843202777s" podCreationTimestamp="2025-10-11 03:55:59 +0000 UTC" firstStartedPulling="2025-10-11 03:56:01.254007935 +0000 UTC m=+52.464489857" lastFinishedPulling="2025-10-11 03:56:43.1831944 +0000 UTC m=+94.393676322" observedRunningTime="2025-10-11 03:56:43.840494502 +0000 UTC m=+95.050976424" watchObservedRunningTime="2025-10-11 03:56:43.843202777 +0000 UTC m=+95.053684699" Oct 11 03:56:44 crc kubenswrapper[4703]: I1011 03:56:44.795828 4703 generic.go:334] "Generic (PLEG): container finished" podID="9301ceea-042d-4d8d-941c-54032cf80047" containerID="eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34" exitCode=0 Oct 11 03:56:44 crc kubenswrapper[4703]: I1011 03:56:44.795906 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerDied","Data":"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34"} Oct 11 03:56:47 crc kubenswrapper[4703]: I1011 03:56:47.548796 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:47 crc kubenswrapper[4703]: I1011 03:56:47.549373 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:47 crc kubenswrapper[4703]: I1011 03:56:47.618942 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:47 crc kubenswrapper[4703]: I1011 03:56:47.816751 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerStarted","Data":"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df"} Oct 11 03:56:47 crc kubenswrapper[4703]: I1011 03:56:47.839303 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4mh8" podStartSLOduration=4.448341561 podStartE2EDuration="51.839286309s" podCreationTimestamp="2025-10-11 03:55:56 +0000 UTC" firstStartedPulling="2025-10-11 03:55:59.005877151 +0000 UTC m=+50.216359073" lastFinishedPulling="2025-10-11 03:56:46.396821889 +0000 UTC m=+97.607303821" observedRunningTime="2025-10-11 03:56:47.83603336 +0000 UTC m=+99.046515282" watchObservedRunningTime="2025-10-11 03:56:47.839286309 +0000 UTC m=+99.049768221" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.075046 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.075458 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.153377 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.323006 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.323093 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.393501 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:50 crc kubenswrapper[4703]: 
I1011 03:56:50.884072 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:56:50 crc kubenswrapper[4703]: I1011 03:56:50.907975 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.104810 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.105910 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v74rb" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="registry-server" containerID="cri-o://f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb" gracePeriod=2 Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.514818 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.670144 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7pb\" (UniqueName: \"kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb\") pod \"fa264969-12b7-415a-a569-0f2cab5d3a30\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.670311 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities\") pod \"fa264969-12b7-415a-a569-0f2cab5d3a30\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.670332 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content\") pod \"fa264969-12b7-415a-a569-0f2cab5d3a30\" (UID: \"fa264969-12b7-415a-a569-0f2cab5d3a30\") " Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.671212 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities" (OuterVolumeSpecName: "utilities") pod "fa264969-12b7-415a-a569-0f2cab5d3a30" (UID: "fa264969-12b7-415a-a569-0f2cab5d3a30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.675960 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb" (OuterVolumeSpecName: "kube-api-access-ss7pb") pod "fa264969-12b7-415a-a569-0f2cab5d3a30" (UID: "fa264969-12b7-415a-a569-0f2cab5d3a30"). InnerVolumeSpecName "kube-api-access-ss7pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.771873 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.771950 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7pb\" (UniqueName: \"kubernetes.io/projected/fa264969-12b7-415a-a569-0f2cab5d3a30-kube-api-access-ss7pb\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.777024 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa264969-12b7-415a-a569-0f2cab5d3a30" (UID: "fa264969-12b7-415a-a569-0f2cab5d3a30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.864851 4703 generic.go:334] "Generic (PLEG): container finished" podID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerID="f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb" exitCode=0 Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.864919 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerDied","Data":"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb"} Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.865055 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v74rb" event={"ID":"fa264969-12b7-415a-a569-0f2cab5d3a30","Type":"ContainerDied","Data":"2e1bd993e04cf8bfa6730bf30d15c5b8282d3bcc0a0cbd8ff1923150a7a9ef69"} Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.865089 4703 scope.go:117] "RemoveContainer" containerID="f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.864960 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v74rb" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.874242 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa264969-12b7-415a-a569-0f2cab5d3a30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.884367 4703 scope.go:117] "RemoveContainer" containerID="33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.908666 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.919144 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v74rb"] Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.928871 4703 scope.go:117] "RemoveContainer" containerID="4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.940108 4703 scope.go:117] "RemoveContainer" containerID="f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb" Oct 11 03:56:53 crc kubenswrapper[4703]: E1011 03:56:53.940546 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb\": container with ID starting with f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb not found: ID does not exist" containerID="f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.940673 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb"} err="failed to get container status 
\"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb\": rpc error: code = NotFound desc = could not find container \"f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb\": container with ID starting with f20f7554a84622b68cc2fe762ed454e175607ea68f10958ae1ca22bde4a0e0bb not found: ID does not exist" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.940768 4703 scope.go:117] "RemoveContainer" containerID="33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f" Oct 11 03:56:53 crc kubenswrapper[4703]: E1011 03:56:53.941205 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f\": container with ID starting with 33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f not found: ID does not exist" containerID="33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.941272 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f"} err="failed to get container status \"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f\": rpc error: code = NotFound desc = could not find container \"33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f\": container with ID starting with 33ece0de46ab457deb5f6692c79a83433693020941ad73570bd19164f23ac92f not found: ID does not exist" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.941319 4703 scope.go:117] "RemoveContainer" containerID="4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9" Oct 11 03:56:53 crc kubenswrapper[4703]: E1011 03:56:53.941937 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9\": container with ID starting with 4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9 not found: ID does not exist" containerID="4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9" Oct 11 03:56:53 crc kubenswrapper[4703]: I1011 03:56:53.941968 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9"} err="failed to get container status \"4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9\": rpc error: code = NotFound desc = could not find container \"4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9\": container with ID starting with 4144da6a1a0db56cd51cfde488ed271f6d130911f2e33b37e8fc88be8fcff3d9 not found: ID does not exist" Oct 11 03:56:54 crc kubenswrapper[4703]: I1011 03:56:54.409969 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:56:54 crc kubenswrapper[4703]: I1011 03:56:54.551535 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 11 03:56:55 crc kubenswrapper[4703]: I1011 03:56:55.539807 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" path="/var/lib/kubelet/pods/fa264969-12b7-415a-a569-0f2cab5d3a30/volumes" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.129278 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.129753 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.177482 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.234936 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.234908514 podStartE2EDuration="3.234908514s" podCreationTimestamp="2025-10-11 03:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:56:57.228148436 +0000 UTC m=+108.438630368" watchObservedRunningTime="2025-10-11 03:56:57.234908514 +0000 UTC m=+108.445390436" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.580576 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.631764 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.889970 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wm7q9" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="registry-server" containerID="cri-o://dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e" gracePeriod=2 Oct 11 03:56:57 crc kubenswrapper[4703]: I1011 03:56:57.932176 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.286228 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.345199 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities\") pod \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.345266 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content\") pod \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.345345 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnblj\" (UniqueName: \"kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj\") pod \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\" (UID: \"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356\") " Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.346618 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities" (OuterVolumeSpecName: "utilities") pod "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" (UID: "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.355658 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj" (OuterVolumeSpecName: "kube-api-access-lnblj") pod "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" (UID: "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356"). InnerVolumeSpecName "kube-api-access-lnblj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.393092 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" (UID: "7dfc7ef5-c10c-4282-9857-ad5a2fb6e356"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.446699 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.446745 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.446783 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnblj\" (UniqueName: \"kubernetes.io/projected/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356-kube-api-access-lnblj\") on node \"crc\" DevicePath \"\"" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.898674 4703 generic.go:334] "Generic (PLEG): container finished" podID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerID="dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e" exitCode=0 Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.898755 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerDied","Data":"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e"} Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.898794 4703 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wm7q9" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.898809 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wm7q9" event={"ID":"7dfc7ef5-c10c-4282-9857-ad5a2fb6e356","Type":"ContainerDied","Data":"74d26f6e65937ff776b7ffd464f56be7a95c0accab4d0fc032d4d5a1a23d5fbd"} Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.898828 4703 scope.go:117] "RemoveContainer" containerID="dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.914287 4703 scope.go:117] "RemoveContainer" containerID="185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.924751 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.927626 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wm7q9"] Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.936537 4703 scope.go:117] "RemoveContainer" containerID="d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.961059 4703 scope.go:117] "RemoveContainer" containerID="dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e" Oct 11 03:56:58 crc kubenswrapper[4703]: E1011 03:56:58.961774 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e\": container with ID starting with dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e not found: ID does not exist" containerID="dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.961809 
4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e"} err="failed to get container status \"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e\": rpc error: code = NotFound desc = could not find container \"dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e\": container with ID starting with dd01e7dd62568a89cd5af117fe12d7121138ea5da2fa4c50a741536ec675268e not found: ID does not exist" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.961854 4703 scope.go:117] "RemoveContainer" containerID="185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572" Oct 11 03:56:58 crc kubenswrapper[4703]: E1011 03:56:58.962524 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572\": container with ID starting with 185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572 not found: ID does not exist" containerID="185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.962576 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572"} err="failed to get container status \"185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572\": rpc error: code = NotFound desc = could not find container \"185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572\": container with ID starting with 185f35149d04573e318d19fbd650119f12a5e2c6fbfda478165f81a37a819572 not found: ID does not exist" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.962606 4703 scope.go:117] "RemoveContainer" containerID="d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1" Oct 11 03:56:58 crc kubenswrapper[4703]: E1011 
03:56:58.962844 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1\": container with ID starting with d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1 not found: ID does not exist" containerID="d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1" Oct 11 03:56:58 crc kubenswrapper[4703]: I1011 03:56:58.962864 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1"} err="failed to get container status \"d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1\": rpc error: code = NotFound desc = could not find container \"d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1\": container with ID starting with d3081d161fc7e1f0c09fb4fe84c8c0112c74db62458d1555cdb940fe95b27aa1 not found: ID does not exist" Oct 11 03:56:59 crc kubenswrapper[4703]: I1011 03:56:59.542815 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" path="/var/lib/kubelet/pods/7dfc7ef5-c10c-4282-9857-ad5a2fb6e356/volumes" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.446259 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" containerName="oauth-openshift" containerID="cri-o://53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4" gracePeriod=15 Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.930348 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.976896 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-759d657776-92g6w"] Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977190 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977212 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977236 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977249 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977270 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977284 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977305 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977319 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977337 4703 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977350 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.977369 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83320069-4d0c-4e40-8a7f-b3bc4beda7a1" containerName="collect-profiles" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.977382 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="83320069-4d0c-4e40-8a7f-b3bc4beda7a1" containerName="collect-profiles" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981532 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981580 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981602 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981619 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981643 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981656 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981676 4703 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981689 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981710 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981722 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981742 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981756 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="extract-utilities" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981775 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981789 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="extract-content" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981807 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86958692-3361-419f-87ef-9b4fe71f9d62" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981822 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="86958692-3361-419f-87ef-9b4fe71f9d62" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981842 4703 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" containerName="oauth-openshift" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981855 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" containerName="oauth-openshift" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981875 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981891 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: E1011 03:57:19.981907 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1809da0d-af06-47ca-bab2-d26183265884" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.981920 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="1809da0d-af06-47ca-bab2-d26183265884" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982154 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="1809da0d-af06-47ca-bab2-d26183265884" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982172 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa264969-12b7-415a-a569-0f2cab5d3a30" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982192 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd430f8d-f9b2-4b4e-9966-5fd12e03e0df" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982213 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" containerName="oauth-openshift" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982230 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="01618421-523c-44ab-b631-f2624ba8ab2d" containerName="kube-multus-additional-cni-plugins" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982252 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="629e92e9-ec79-44b4-ab21-7b44609e75b3" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982265 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="83320069-4d0c-4e40-8a7f-b3bc4beda7a1" containerName="collect-profiles" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982284 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="86958692-3361-419f-87ef-9b4fe71f9d62" containerName="pruner" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982297 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfc7ef5-c10c-4282-9857-ad5a2fb6e356" containerName="registry-server" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.982950 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:19 crc kubenswrapper[4703]: I1011 03:57:19.983315 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-759d657776-92g6w"] Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027174 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027264 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027314 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027383 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr6bd\" (UniqueName: \"kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027415 4703 generic.go:334] "Generic (PLEG): container finished" podID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" 
containerID="53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4" exitCode=0 Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027444 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027455 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" event={"ID":"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0","Type":"ContainerDied","Data":"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4"} Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027515 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" event={"ID":"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0","Type":"ContainerDied","Data":"7662e5490134c9c4bf21ca70e3bab938a356f4cd0598d63d768994f46ac826c6"} Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027536 4703 scope.go:117] "RemoveContainer" containerID="53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027557 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027601 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection\") pod 
\"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027638 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz6vn" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027662 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027699 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027735 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027784 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027819 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027853 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.027887 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login\") pod \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\" (UID: \"be7d6fb4-1943-4d7f-87a8-ce906ed31cf0\") " Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028071 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028122 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028154 4703 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028180 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d914cab2-0c88-43b1-b563-627a6d934561-audit-dir\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028232 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028276 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-error\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028310 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-serving-cert\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028349 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028404 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028440 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028511 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-92g6w\" (UID: 
\"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028550 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-audit-policies\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028602 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028609 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028710 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028719 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h25tf\" (UniqueName: \"kubernetes.io/projected/d914cab2-0c88-43b1-b563-627a6d934561-kube-api-access-h25tf\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.028794 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.029060 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.029083 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.029099 4703 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.029116 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.030367 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.036254 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.037274 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.037770 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.037989 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.038111 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.038339 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.047717 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd" (OuterVolumeSpecName: "kube-api-access-xr6bd") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "kube-api-access-xr6bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.055942 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.059994 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" (UID: "be7d6fb4-1943-4d7f-87a8-ce906ed31cf0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.118487 4703 scope.go:117] "RemoveContainer" containerID="53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4" Oct 11 03:57:20 crc kubenswrapper[4703]: E1011 03:57:20.119053 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4\": container with ID starting with 53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4 not found: ID does not exist" containerID="53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.119101 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4"} err="failed to get container status \"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4\": rpc error: code = NotFound desc = could not find container \"53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4\": container with ID starting with 53a97d2ac6def2498603c9f3ea25b9fc5b7086d04787e3bb2e81ab040d82c7d4 not found: ID does not exist" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.129826 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.129889 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.129914 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.129944 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.129971 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-audit-policies\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130004 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " 
pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130033 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h25tf\" (UniqueName: \"kubernetes.io/projected/d914cab2-0c88-43b1-b563-627a6d934561-kube-api-access-h25tf\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130060 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130090 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130117 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130152 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/d914cab2-0c88-43b1-b563-627a6d934561-audit-dir\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130183 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130209 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-error\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130232 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-serving-cert\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130275 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130291 4703 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130307 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130320 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130332 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130346 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130359 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130371 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-serving-cert\") on node 
\"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130384 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130397 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr6bd\" (UniqueName: \"kubernetes.io/projected/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-kube-api-access-xr6bd\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.130410 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.131232 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d914cab2-0c88-43b1-b563-627a6d934561-audit-dir\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.131801 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.131966 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-audit-policies\") pod 
\"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.132850 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.133021 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.134710 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.134927 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.135329 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-error\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.135823 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.135829 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.136888 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.137035 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.145586 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d914cab2-0c88-43b1-b563-627a6d934561-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.149448 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h25tf\" (UniqueName: \"kubernetes.io/projected/d914cab2-0c88-43b1-b563-627a6d934561-kube-api-access-h25tf\") pod \"oauth-openshift-759d657776-92g6w\" (UID: \"d914cab2-0c88-43b1-b563-627a6d934561\") " pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.304594 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.378851 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.384215 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz6vn"] Oct 11 03:57:20 crc kubenswrapper[4703]: I1011 03:57:20.824373 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-759d657776-92g6w"] Oct 11 03:57:21 crc kubenswrapper[4703]: I1011 03:57:21.037169 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" event={"ID":"d914cab2-0c88-43b1-b563-627a6d934561","Type":"ContainerStarted","Data":"0b81fe26549dc4bc77d94002ba5b4b99cc6928ecef263e87903ffe7c0d05f252"} Oct 11 03:57:21 crc kubenswrapper[4703]: I1011 03:57:21.546975 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7d6fb4-1943-4d7f-87a8-ce906ed31cf0" path="/var/lib/kubelet/pods/be7d6fb4-1943-4d7f-87a8-ce906ed31cf0/volumes" Oct 11 03:57:22 crc kubenswrapper[4703]: I1011 03:57:22.048490 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" event={"ID":"d914cab2-0c88-43b1-b563-627a6d934561","Type":"ContainerStarted","Data":"d993292b38cf004103abfebbf70db78a1d7d839a6517f9098f816f50faaa8eb7"} Oct 11 03:57:22 crc kubenswrapper[4703]: I1011 03:57:22.048934 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:22 crc kubenswrapper[4703]: I1011 03:57:22.061720 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" Oct 11 03:57:22 crc kubenswrapper[4703]: I1011 
03:57:22.081379 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-759d657776-92g6w" podStartSLOduration=28.081346146 podStartE2EDuration="28.081346146s" podCreationTimestamp="2025-10-11 03:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:57:22.078500703 +0000 UTC m=+133.288982675" watchObservedRunningTime="2025-10-11 03:57:22.081346146 +0000 UTC m=+133.291828118" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.490382 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.491996 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z4mh8" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="registry-server" containerID="cri-o://344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df" gracePeriod=30 Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.505483 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.505773 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g6lsz" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="registry-server" containerID="cri-o://f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb" gracePeriod=30 Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.527619 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.528816 4703 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" containerID="cri-o://80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c" gracePeriod=30 Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.533708 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.533933 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qnqtn" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="registry-server" containerID="cri-o://82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3" gracePeriod=30 Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.545257 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.545589 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5sn7" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="registry-server" containerID="cri-o://ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc" gracePeriod=30 Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.555038 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfkzg"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.556615 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.578616 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfkzg"] Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.648319 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.648435 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fzb\" (UniqueName: \"kubernetes.io/projected/c9b7ec35-94a9-42b2-b086-a5810df3acf3-kube-api-access-m8fzb\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.648505 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.749369 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: 
\"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.749446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.749505 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fzb\" (UniqueName: \"kubernetes.io/projected/c9b7ec35-94a9-42b2-b086-a5810df3acf3-kube-api-access-m8fzb\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.750859 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.756538 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9b7ec35-94a9-42b2-b086-a5810df3acf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.777316 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m8fzb\" (UniqueName: \"kubernetes.io/projected/c9b7ec35-94a9-42b2-b086-a5810df3acf3-kube-api-access-m8fzb\") pod \"marketplace-operator-79b997595-dfkzg\" (UID: \"c9b7ec35-94a9-42b2-b086-a5810df3acf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.873834 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:48 crc kubenswrapper[4703]: I1011 03:57:48.969309 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.003689 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.012495 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.037263 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052779 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content\") pod \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052841 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtg94\" (UniqueName: \"kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94\") pod \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052888 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z74n\" (UniqueName: \"kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n\") pod \"9301ceea-042d-4d8d-941c-54032cf80047\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052931 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities\") pod \"9301ceea-042d-4d8d-941c-54032cf80047\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052951 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities\") pod \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.052995 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca\") pod \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.053030 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwqgx\" (UniqueName: \"kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx\") pod \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\" (UID: \"d85389b5-5e4e-4939-8f96-902c5a3ed9e2\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.053060 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics\") pod \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\" (UID: \"c7664ffa-e88d-468b-b43d-3b3b6ec5195b\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.053087 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content\") pod \"9301ceea-042d-4d8d-941c-54032cf80047\" (UID: \"9301ceea-042d-4d8d-941c-54032cf80047\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.059728 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c7664ffa-e88d-468b-b43d-3b3b6ec5195b" (UID: "c7664ffa-e88d-468b-b43d-3b3b6ec5195b"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.060341 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities" (OuterVolumeSpecName: "utilities") pod "9301ceea-042d-4d8d-941c-54032cf80047" (UID: "9301ceea-042d-4d8d-941c-54032cf80047"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.065591 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c7664ffa-e88d-468b-b43d-3b3b6ec5195b" (UID: "c7664ffa-e88d-468b-b43d-3b3b6ec5195b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.068342 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94" (OuterVolumeSpecName: "kube-api-access-qtg94") pod "c7664ffa-e88d-468b-b43d-3b3b6ec5195b" (UID: "c7664ffa-e88d-468b-b43d-3b3b6ec5195b"). InnerVolumeSpecName "kube-api-access-qtg94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.068567 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx" (OuterVolumeSpecName: "kube-api-access-jwqgx") pod "d85389b5-5e4e-4939-8f96-902c5a3ed9e2" (UID: "d85389b5-5e4e-4939-8f96-902c5a3ed9e2"). InnerVolumeSpecName "kube-api-access-jwqgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.068751 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities" (OuterVolumeSpecName: "utilities") pod "d85389b5-5e4e-4939-8f96-902c5a3ed9e2" (UID: "d85389b5-5e4e-4939-8f96-902c5a3ed9e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.071675 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n" (OuterVolumeSpecName: "kube-api-access-6z74n") pod "9301ceea-042d-4d8d-941c-54032cf80047" (UID: "9301ceea-042d-4d8d-941c-54032cf80047"). InnerVolumeSpecName "kube-api-access-6z74n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.092656 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.126965 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9301ceea-042d-4d8d-941c-54032cf80047" (UID: "9301ceea-042d-4d8d-941c-54032cf80047"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.153215 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d85389b5-5e4e-4939-8f96-902c5a3ed9e2" (UID: "d85389b5-5e4e-4939-8f96-902c5a3ed9e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154379 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content\") pod \"27051689-7c44-4841-bcbc-7118a66ae0a0\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154566 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content\") pod \"9307554f-a116-4c31-9585-c712558479fc\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154642 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities\") pod \"9307554f-a116-4c31-9585-c712558479fc\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154866 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69q9q\" (UniqueName: \"kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q\") pod \"27051689-7c44-4841-bcbc-7118a66ae0a0\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154910 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5nx5\" (UniqueName: \"kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5\") pod \"9307554f-a116-4c31-9585-c712558479fc\" (UID: \"9307554f-a116-4c31-9585-c712558479fc\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.154937 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities\") pod \"27051689-7c44-4841-bcbc-7118a66ae0a0\" (UID: \"27051689-7c44-4841-bcbc-7118a66ae0a0\") " Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155205 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155226 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtg94\" (UniqueName: \"kubernetes.io/projected/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-kube-api-access-qtg94\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155241 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z74n\" (UniqueName: \"kubernetes.io/projected/9301ceea-042d-4d8d-941c-54032cf80047-kube-api-access-6z74n\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155258 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155272 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155284 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155296 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwqgx\" (UniqueName: 
\"kubernetes.io/projected/d85389b5-5e4e-4939-8f96-902c5a3ed9e2-kube-api-access-jwqgx\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155307 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7664ffa-e88d-468b-b43d-3b3b6ec5195b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.155319 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9301ceea-042d-4d8d-941c-54032cf80047-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.157032 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities" (OuterVolumeSpecName: "utilities") pod "27051689-7c44-4841-bcbc-7118a66ae0a0" (UID: "27051689-7c44-4841-bcbc-7118a66ae0a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.158379 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities" (OuterVolumeSpecName: "utilities") pod "9307554f-a116-4c31-9585-c712558479fc" (UID: "9307554f-a116-4c31-9585-c712558479fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.160432 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q" (OuterVolumeSpecName: "kube-api-access-69q9q") pod "27051689-7c44-4841-bcbc-7118a66ae0a0" (UID: "27051689-7c44-4841-bcbc-7118a66ae0a0"). InnerVolumeSpecName "kube-api-access-69q9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.160919 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5" (OuterVolumeSpecName: "kube-api-access-w5nx5") pod "9307554f-a116-4c31-9585-c712558479fc" (UID: "9307554f-a116-4c31-9585-c712558479fc"). InnerVolumeSpecName "kube-api-access-w5nx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.172675 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9307554f-a116-4c31-9585-c712558479fc" (UID: "9307554f-a116-4c31-9585-c712558479fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.205765 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27051689-7c44-4841-bcbc-7118a66ae0a0" (UID: "27051689-7c44-4841-bcbc-7118a66ae0a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.245684 4703 generic.go:334] "Generic (PLEG): container finished" podID="9307554f-a116-4c31-9585-c712558479fc" containerID="82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3" exitCode=0 Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.245862 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqtn" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.245890 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerDied","Data":"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.245979 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqtn" event={"ID":"9307554f-a116-4c31-9585-c712558479fc","Type":"ContainerDied","Data":"02d38393516d074048165b50fc80e058f6037a52304aea6e369644d4bbc6cfbe"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.246006 4703 scope.go:117] "RemoveContainer" containerID="82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.250602 4703 generic.go:334] "Generic (PLEG): container finished" podID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerID="ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc" exitCode=0 Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.250643 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerDied","Data":"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.250678 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5sn7" event={"ID":"d85389b5-5e4e-4939-8f96-902c5a3ed9e2","Type":"ContainerDied","Data":"f21f8d8a91f7e64a365ba63004c6223aba654e8e41fd0121a5ed8c0706b7547f"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.250625 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5sn7" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.253897 4703 generic.go:334] "Generic (PLEG): container finished" podID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerID="80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c" exitCode=0 Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.254009 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.254011 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" event={"ID":"c7664ffa-e88d-468b-b43d-3b3b6ec5195b","Type":"ContainerDied","Data":"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.254189 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tjd96" event={"ID":"c7664ffa-e88d-468b-b43d-3b3b6ec5195b","Type":"ContainerDied","Data":"62f1f6e130fc31d991ac4dbcdd87ba1e3fb683ff9d40686001eeaa2c5906c4d6"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.257826 4703 generic.go:334] "Generic (PLEG): container finished" podID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerID="f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb" exitCode=0 Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.257897 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerDied","Data":"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.257915 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6lsz" 
event={"ID":"27051689-7c44-4841-bcbc-7118a66ae0a0","Type":"ContainerDied","Data":"3d332cac8f64275427f261d7221179b27e911f573168660eebb57a54980025a7"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.257999 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6lsz" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260252 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260279 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9307554f-a116-4c31-9585-c712558479fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260293 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69q9q\" (UniqueName: \"kubernetes.io/projected/27051689-7c44-4841-bcbc-7118a66ae0a0-kube-api-access-69q9q\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260309 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5nx5\" (UniqueName: \"kubernetes.io/projected/9307554f-a116-4c31-9585-c712558479fc-kube-api-access-w5nx5\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260323 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.260337 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27051689-7c44-4841-bcbc-7118a66ae0a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:57:49 crc 
kubenswrapper[4703]: I1011 03:57:49.261363 4703 generic.go:334] "Generic (PLEG): container finished" podID="9301ceea-042d-4d8d-941c-54032cf80047" containerID="344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df" exitCode=0 Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.261402 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerDied","Data":"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.261428 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4mh8" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.261431 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4mh8" event={"ID":"9301ceea-042d-4d8d-941c-54032cf80047","Type":"ContainerDied","Data":"0f28f57e7dd1a444ac576b47c3ed1c53ba3015f4b48f8487a9951ddc0db1e252"} Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.266581 4703 scope.go:117] "RemoveContainer" containerID="28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.295300 4703 scope.go:117] "RemoveContainer" containerID="1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.310776 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.316199 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5sn7"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.325151 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 
03:57:49.330665 4703 scope.go:117] "RemoveContainer" containerID="82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.332828 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3\": container with ID starting with 82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3 not found: ID does not exist" containerID="82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.332869 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3"} err="failed to get container status \"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3\": rpc error: code = NotFound desc = could not find container \"82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3\": container with ID starting with 82274d9f2d29211be5e19353b40f52113c140b6f60d8ca4a92bd9505e648d6d3 not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.332893 4703 scope.go:117] "RemoveContainer" containerID="28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.334295 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce\": container with ID starting with 28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce not found: ID does not exist" containerID="28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.334313 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce"} err="failed to get container status \"28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce\": rpc error: code = NotFound desc = could not find container \"28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce\": container with ID starting with 28a04f02e0b3d54fc35d46f6ee66ea8fff2bae3deda09d586d27358ec4a4a2ce not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.334329 4703 scope.go:117] "RemoveContainer" containerID="1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.334815 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241\": container with ID starting with 1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241 not found: ID does not exist" containerID="1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.334844 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241"} err="failed to get container status \"1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241\": rpc error: code = NotFound desc = could not find container \"1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241\": container with ID starting with 1cd0e4c1ea7ee1bee94aa85f64857d65d2ad9699bb41abd3f98995c2bc634241 not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.334860 4703 scope.go:117] "RemoveContainer" containerID="ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.335088 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-g6lsz"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.337664 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.345581 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tjd96"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.352302 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.355010 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqtn"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.356720 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.358747 4703 scope.go:117] "RemoveContainer" containerID="79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.361370 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z4mh8"] Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.371353 4703 scope.go:117] "RemoveContainer" containerID="996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.384245 4703 scope.go:117] "RemoveContainer" containerID="ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.384681 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc\": container with ID starting with 
ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc not found: ID does not exist" containerID="ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.384708 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc"} err="failed to get container status \"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc\": rpc error: code = NotFound desc = could not find container \"ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc\": container with ID starting with ba047b40ea2fdd48079883611e6396257697c20ce640ee75a7ffeeee88c257dc not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.384732 4703 scope.go:117] "RemoveContainer" containerID="79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.385054 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28\": container with ID starting with 79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28 not found: ID does not exist" containerID="79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.385080 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28"} err="failed to get container status \"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28\": rpc error: code = NotFound desc = could not find container \"79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28\": container with ID starting with 79f7936089ee3ef033f2ac88ab30434456328c92ba0955feea205f3979324a28 not found: ID does not 
exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.385097 4703 scope.go:117] "RemoveContainer" containerID="996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.385402 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c\": container with ID starting with 996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c not found: ID does not exist" containerID="996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.385420 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c"} err="failed to get container status \"996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c\": rpc error: code = NotFound desc = could not find container \"996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c\": container with ID starting with 996e01e05a22e6cfbffdb340157f7e2c20ddfc5cfa2fb8fdef34e3dd16c8603c not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.385433 4703 scope.go:117] "RemoveContainer" containerID="80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.396523 4703 scope.go:117] "RemoveContainer" containerID="80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.397060 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c\": container with ID starting with 80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c not found: ID does not exist" 
containerID="80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.397105 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c"} err="failed to get container status \"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c\": rpc error: code = NotFound desc = could not find container \"80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c\": container with ID starting with 80d97fd1e1b79312dfd7c45fec2f0c1ed667fdead01a11b72e3cbf7c483d149c not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.397142 4703 scope.go:117] "RemoveContainer" containerID="f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.407405 4703 scope.go:117] "RemoveContainer" containerID="ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.419375 4703 scope.go:117] "RemoveContainer" containerID="2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.430537 4703 scope.go:117] "RemoveContainer" containerID="f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.431666 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb\": container with ID starting with f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb not found: ID does not exist" containerID="f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.431709 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb"} err="failed to get container status \"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb\": rpc error: code = NotFound desc = could not find container \"f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb\": container with ID starting with f68ff242292996fab9da861be6260892926aea6c943e6117400696b059a4d3fb not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.431745 4703 scope.go:117] "RemoveContainer" containerID="ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.432055 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f\": container with ID starting with ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f not found: ID does not exist" containerID="ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.432086 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f"} err="failed to get container status \"ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f\": rpc error: code = NotFound desc = could not find container \"ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f\": container with ID starting with ad876d8e6cd6fd289d1cc2b8fdb677c15de192e4e7c5c27db757451a97f5953f not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.432112 4703 scope.go:117] "RemoveContainer" containerID="2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.432403 4703 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a\": container with ID starting with 2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a not found: ID does not exist" containerID="2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.432480 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a"} err="failed to get container status \"2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a\": rpc error: code = NotFound desc = could not find container \"2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a\": container with ID starting with 2570a196676290333ca5f8845814cc698f4b05ea074208f037e9c1e1c5a60f5a not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.432514 4703 scope.go:117] "RemoveContainer" containerID="344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.442671 4703 scope.go:117] "RemoveContainer" containerID="eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.455524 4703 scope.go:117] "RemoveContainer" containerID="b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.470879 4703 scope.go:117] "RemoveContainer" containerID="344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.471293 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df\": container with ID starting with 
344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df not found: ID does not exist" containerID="344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.471336 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df"} err="failed to get container status \"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df\": rpc error: code = NotFound desc = could not find container \"344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df\": container with ID starting with 344e22c5362db32f5b336956b7fe988306536be9b2199dd5854ec9b215f291df not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.471366 4703 scope.go:117] "RemoveContainer" containerID="eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.472941 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34\": container with ID starting with eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34 not found: ID does not exist" containerID="eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.472988 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34"} err="failed to get container status \"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34\": rpc error: code = NotFound desc = could not find container \"eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34\": container with ID starting with eafc0e378d62f16634572dbc920a814486e8475f3588700a976f08e71c2c0a34 not found: ID does not 
exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.473018 4703 scope.go:117] "RemoveContainer" containerID="b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642" Oct 11 03:57:49 crc kubenswrapper[4703]: E1011 03:57:49.473460 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642\": container with ID starting with b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642 not found: ID does not exist" containerID="b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.473514 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642"} err="failed to get container status \"b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642\": rpc error: code = NotFound desc = could not find container \"b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642\": container with ID starting with b15f15e4c0b3ad19a7ca6242c3c99e67a98b48a9e68c55c5bd344a5c92c50642 not found: ID does not exist" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.548028 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" path="/var/lib/kubelet/pods/27051689-7c44-4841-bcbc-7118a66ae0a0/volumes" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.548768 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9301ceea-042d-4d8d-941c-54032cf80047" path="/var/lib/kubelet/pods/9301ceea-042d-4d8d-941c-54032cf80047/volumes" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.549431 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9307554f-a116-4c31-9585-c712558479fc" path="/var/lib/kubelet/pods/9307554f-a116-4c31-9585-c712558479fc/volumes" Oct 
11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.550840 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" path="/var/lib/kubelet/pods/c7664ffa-e88d-468b-b43d-3b3b6ec5195b/volumes" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.551616 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" path="/var/lib/kubelet/pods/d85389b5-5e4e-4939-8f96-902c5a3ed9e2/volumes" Oct 11 03:57:49 crc kubenswrapper[4703]: I1011 03:57:49.664234 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfkzg"] Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.255401 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.255521 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.273950 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" event={"ID":"c9b7ec35-94a9-42b2-b086-a5810df3acf3","Type":"ContainerStarted","Data":"f6c5296cb61d504387e0c4550336fa4d24883c968b1fef0c2c76382290823bd0"} Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.274003 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" 
event={"ID":"c9b7ec35-94a9-42b2-b086-a5810df3acf3","Type":"ContainerStarted","Data":"9fb1cb6068fd4de4f9df7194bff2b68ce16759f06bbfb619c69bb0ec092dda11"} Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.274634 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.276840 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.309055 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dfkzg" podStartSLOduration=2.309033069 podStartE2EDuration="2.309033069s" podCreationTimestamp="2025-10-11 03:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:57:50.299869231 +0000 UTC m=+161.510351153" watchObservedRunningTime="2025-10-11 03:57:50.309033069 +0000 UTC m=+161.519514991" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713183 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwlnc"] Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713568 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713594 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713618 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713635 4703 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713658 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713676 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713696 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713708 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713722 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713734 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713747 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713761 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713777 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713789 4703 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713805 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713817 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="extract-utilities" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713835 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713847 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713867 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713880 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713898 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713910 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713928 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713940 4703 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="extract-content" Oct 11 03:57:50 crc kubenswrapper[4703]: E1011 03:57:50.713955 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.713967 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.714185 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7664ffa-e88d-468b-b43d-3b3b6ec5195b" containerName="marketplace-operator" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.714213 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9301ceea-042d-4d8d-941c-54032cf80047" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.714241 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85389b5-5e4e-4939-8f96-902c5a3ed9e2" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.714256 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="27051689-7c44-4841-bcbc-7118a66ae0a0" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.714274 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9307554f-a116-4c31-9585-c712558479fc" containerName="registry-server" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.716216 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.718650 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.738452 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwlnc"] Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.778984 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqj8\" (UniqueName: \"kubernetes.io/projected/f65a18cc-3c4c-427d-b3cf-82b33c238e47-kube-api-access-xnqj8\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.779057 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-catalog-content\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.779250 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-utilities\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.880562 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-catalog-content\") pod \"redhat-marketplace-kwlnc\" (UID: 
\"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.880700 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-utilities\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.880757 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqj8\" (UniqueName: \"kubernetes.io/projected/f65a18cc-3c4c-427d-b3cf-82b33c238e47-kube-api-access-xnqj8\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.881204 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-utilities\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.881375 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65a18cc-3c4c-427d-b3cf-82b33c238e47-catalog-content\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.916441 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhh4v"] Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.918189 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.921558 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.935924 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhh4v"] Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.938315 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqj8\" (UniqueName: \"kubernetes.io/projected/f65a18cc-3c4c-427d-b3cf-82b33c238e47-kube-api-access-xnqj8\") pod \"redhat-marketplace-kwlnc\" (UID: \"f65a18cc-3c4c-427d-b3cf-82b33c238e47\") " pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.981558 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-utilities\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.981661 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxwd\" (UniqueName: \"kubernetes.io/projected/f10f15cc-0c90-45a7-8f0e-d258775abff7-kube-api-access-njxwd\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:50 crc kubenswrapper[4703]: I1011 03:57:50.981702 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-catalog-content\") pod \"redhat-operators-vhh4v\" (UID: 
\"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.080864 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.094687 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-catalog-content\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.094765 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-utilities\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.095420 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-utilities\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.095417 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10f15cc-0c90-45a7-8f0e-d258775abff7-catalog-content\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.095853 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxwd\" (UniqueName: 
\"kubernetes.io/projected/f10f15cc-0c90-45a7-8f0e-d258775abff7-kube-api-access-njxwd\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.114795 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxwd\" (UniqueName: \"kubernetes.io/projected/f10f15cc-0c90-45a7-8f0e-d258775abff7-kube-api-access-njxwd\") pod \"redhat-operators-vhh4v\" (UID: \"f10f15cc-0c90-45a7-8f0e-d258775abff7\") " pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.252277 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.290166 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwlnc"] Oct 11 03:57:51 crc kubenswrapper[4703]: W1011 03:57:51.301098 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65a18cc_3c4c_427d_b3cf_82b33c238e47.slice/crio-e205c507671a5cfe240b866fce7712ad6e16a1938e2ec18d6b0480a96f402107 WatchSource:0}: Error finding container e205c507671a5cfe240b866fce7712ad6e16a1938e2ec18d6b0480a96f402107: Status 404 returned error can't find the container with id e205c507671a5cfe240b866fce7712ad6e16a1938e2ec18d6b0480a96f402107 Oct 11 03:57:51 crc kubenswrapper[4703]: E1011 03:57:51.541730 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65a18cc_3c4c_427d_b3cf_82b33c238e47.slice/crio-conmon-0953748dfaa4de5a42d24a8185e56494f546e87e6704f775ca12b06fb2a230da.scope\": RecentStats: unable to find data in memory cache]" Oct 11 03:57:51 crc kubenswrapper[4703]: I1011 03:57:51.665489 
4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhh4v"] Oct 11 03:57:51 crc kubenswrapper[4703]: W1011 03:57:51.674411 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10f15cc_0c90_45a7_8f0e_d258775abff7.slice/crio-56492111fcc268a4105c43c4c30a6969b33654fe2065d6a7dbf1f3f52793d0d3 WatchSource:0}: Error finding container 56492111fcc268a4105c43c4c30a6969b33654fe2065d6a7dbf1f3f52793d0d3: Status 404 returned error can't find the container with id 56492111fcc268a4105c43c4c30a6969b33654fe2065d6a7dbf1f3f52793d0d3 Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.302768 4703 generic.go:334] "Generic (PLEG): container finished" podID="f65a18cc-3c4c-427d-b3cf-82b33c238e47" containerID="0953748dfaa4de5a42d24a8185e56494f546e87e6704f775ca12b06fb2a230da" exitCode=0 Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.302922 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwlnc" event={"ID":"f65a18cc-3c4c-427d-b3cf-82b33c238e47","Type":"ContainerDied","Data":"0953748dfaa4de5a42d24a8185e56494f546e87e6704f775ca12b06fb2a230da"} Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.303323 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwlnc" event={"ID":"f65a18cc-3c4c-427d-b3cf-82b33c238e47","Type":"ContainerStarted","Data":"e205c507671a5cfe240b866fce7712ad6e16a1938e2ec18d6b0480a96f402107"} Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.308597 4703 generic.go:334] "Generic (PLEG): container finished" podID="f10f15cc-0c90-45a7-8f0e-d258775abff7" containerID="d83cf51db0be7684ab20a50b7a60f6ccadf73f340809e0f738d5b2374a180b98" exitCode=0 Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.308630 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhh4v" 
event={"ID":"f10f15cc-0c90-45a7-8f0e-d258775abff7","Type":"ContainerDied","Data":"d83cf51db0be7684ab20a50b7a60f6ccadf73f340809e0f738d5b2374a180b98"} Oct 11 03:57:52 crc kubenswrapper[4703]: I1011 03:57:52.308688 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhh4v" event={"ID":"f10f15cc-0c90-45a7-8f0e-d258775abff7","Type":"ContainerStarted","Data":"56492111fcc268a4105c43c4c30a6969b33654fe2065d6a7dbf1f3f52793d0d3"} Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.115965 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pcrkn"] Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.118023 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.123051 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.128688 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pcrkn"] Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.226479 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-utilities\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.226526 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-catalog-content\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " 
pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.226573 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52rjk\" (UniqueName: \"kubernetes.io/projected/9b932657-cddd-4fd6-b45b-074f292386da-kube-api-access-52rjk\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.310225 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8zb8v"] Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.311691 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.315073 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.318107 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8zb8v"] Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335009 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52rjk\" (UniqueName: \"kubernetes.io/projected/9b932657-cddd-4fd6-b45b-074f292386da-kube-api-access-52rjk\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335106 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-utilities\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " 
pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335147 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-catalog-content\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335139 4703 generic.go:334] "Generic (PLEG): container finished" podID="f65a18cc-3c4c-427d-b3cf-82b33c238e47" containerID="1c30abfd4c7f31e673074cd1797dd0b126985bab32a2c7dd8548191c12b49bd0" exitCode=0 Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335542 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwlnc" event={"ID":"f65a18cc-3c4c-427d-b3cf-82b33c238e47","Type":"ContainerDied","Data":"1c30abfd4c7f31e673074cd1797dd0b126985bab32a2c7dd8548191c12b49bd0"} Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.335654 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-catalog-content\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.336255 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b932657-cddd-4fd6-b45b-074f292386da-utilities\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.348711 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhh4v" 
event={"ID":"f10f15cc-0c90-45a7-8f0e-d258775abff7","Type":"ContainerStarted","Data":"e86a8b4b593964d176df1529c1de83b05f2a1c0089e0d8febb60f08857f033c8"} Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.360862 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52rjk\" (UniqueName: \"kubernetes.io/projected/9b932657-cddd-4fd6-b45b-074f292386da-kube-api-access-52rjk\") pod \"certified-operators-pcrkn\" (UID: \"9b932657-cddd-4fd6-b45b-074f292386da\") " pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.436575 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-utilities\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.437086 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-catalog-content\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.437249 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnr2q\" (UniqueName: \"kubernetes.io/projected/08edae5f-0299-4ef5-98d1-3f1be67bfb35-kube-api-access-fnr2q\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.538702 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-utilities\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.538793 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-catalog-content\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.538826 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnr2q\" (UniqueName: \"kubernetes.io/projected/08edae5f-0299-4ef5-98d1-3f1be67bfb35-kube-api-access-fnr2q\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.540164 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-catalog-content\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.540412 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08edae5f-0299-4ef5-98d1-3f1be67bfb35-utilities\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.541063 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.557694 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnr2q\" (UniqueName: \"kubernetes.io/projected/08edae5f-0299-4ef5-98d1-3f1be67bfb35-kube-api-access-fnr2q\") pod \"community-operators-8zb8v\" (UID: \"08edae5f-0299-4ef5-98d1-3f1be67bfb35\") " pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.648148 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:57:53 crc kubenswrapper[4703]: I1011 03:57:53.946940 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pcrkn"] Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.028287 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8zb8v"] Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.355685 4703 generic.go:334] "Generic (PLEG): container finished" podID="08edae5f-0299-4ef5-98d1-3f1be67bfb35" containerID="ccb11cb2c1751d100b45b525d8ead01b5fc0fa3b703cb1f9a5940cee7ceb95ee" exitCode=0 Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.355750 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zb8v" event={"ID":"08edae5f-0299-4ef5-98d1-3f1be67bfb35","Type":"ContainerDied","Data":"ccb11cb2c1751d100b45b525d8ead01b5fc0fa3b703cb1f9a5940cee7ceb95ee"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.355812 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zb8v" event={"ID":"08edae5f-0299-4ef5-98d1-3f1be67bfb35","Type":"ContainerStarted","Data":"f0681daa4aaf330f5140d550d8caf8647d56424d33e581b0ed31a892adfaa83c"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.357995 4703 
generic.go:334] "Generic (PLEG): container finished" podID="9b932657-cddd-4fd6-b45b-074f292386da" containerID="d4f48db4f97ef29f4a87317178cc2eee7d434775ed539817e5ca8faea2ac03f4" exitCode=0 Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.358048 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcrkn" event={"ID":"9b932657-cddd-4fd6-b45b-074f292386da","Type":"ContainerDied","Data":"d4f48db4f97ef29f4a87317178cc2eee7d434775ed539817e5ca8faea2ac03f4"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.358073 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcrkn" event={"ID":"9b932657-cddd-4fd6-b45b-074f292386da","Type":"ContainerStarted","Data":"87e60fe96b796ac4fe526391f63bad32a55ebbbfabca4dd7712158b6282b0ab3"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.360658 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwlnc" event={"ID":"f65a18cc-3c4c-427d-b3cf-82b33c238e47","Type":"ContainerStarted","Data":"243a0bb2415bed8cccd54f3f41491462e996b067909b475ff076b29ce059f400"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.362491 4703 generic.go:334] "Generic (PLEG): container finished" podID="f10f15cc-0c90-45a7-8f0e-d258775abff7" containerID="e86a8b4b593964d176df1529c1de83b05f2a1c0089e0d8febb60f08857f033c8" exitCode=0 Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.362542 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhh4v" event={"ID":"f10f15cc-0c90-45a7-8f0e-d258775abff7","Type":"ContainerDied","Data":"e86a8b4b593964d176df1529c1de83b05f2a1c0089e0d8febb60f08857f033c8"} Oct 11 03:57:54 crc kubenswrapper[4703]: I1011 03:57:54.416757 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwlnc" podStartSLOduration=2.972947802 podStartE2EDuration="4.416734486s" 
podCreationTimestamp="2025-10-11 03:57:50 +0000 UTC" firstStartedPulling="2025-10-11 03:57:52.308187065 +0000 UTC m=+163.518668987" lastFinishedPulling="2025-10-11 03:57:53.751973749 +0000 UTC m=+164.962455671" observedRunningTime="2025-10-11 03:57:54.416445827 +0000 UTC m=+165.626927769" watchObservedRunningTime="2025-10-11 03:57:54.416734486 +0000 UTC m=+165.627216408" Oct 11 03:57:55 crc kubenswrapper[4703]: I1011 03:57:55.384071 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zb8v" event={"ID":"08edae5f-0299-4ef5-98d1-3f1be67bfb35","Type":"ContainerStarted","Data":"4b11125e58cf108799d14f187d53fb8250977144039502dd8d57d734495af876"} Oct 11 03:57:55 crc kubenswrapper[4703]: I1011 03:57:55.392136 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhh4v" event={"ID":"f10f15cc-0c90-45a7-8f0e-d258775abff7","Type":"ContainerStarted","Data":"1dd74550618fa4a86611b2245ddc59a4fa92f8930b3f7c36eb5c8f318c58f529"} Oct 11 03:57:55 crc kubenswrapper[4703]: I1011 03:57:55.420930 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhh4v" podStartSLOduration=2.975630721 podStartE2EDuration="5.420912518s" podCreationTimestamp="2025-10-11 03:57:50 +0000 UTC" firstStartedPulling="2025-10-11 03:57:52.310198087 +0000 UTC m=+163.520680049" lastFinishedPulling="2025-10-11 03:57:54.755479924 +0000 UTC m=+165.965961846" observedRunningTime="2025-10-11 03:57:55.416533134 +0000 UTC m=+166.627015056" watchObservedRunningTime="2025-10-11 03:57:55.420912518 +0000 UTC m=+166.631394450" Oct 11 03:57:56 crc kubenswrapper[4703]: I1011 03:57:56.402210 4703 generic.go:334] "Generic (PLEG): container finished" podID="08edae5f-0299-4ef5-98d1-3f1be67bfb35" containerID="4b11125e58cf108799d14f187d53fb8250977144039502dd8d57d734495af876" exitCode=0 Oct 11 03:57:56 crc kubenswrapper[4703]: I1011 03:57:56.403545 4703 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-8zb8v" event={"ID":"08edae5f-0299-4ef5-98d1-3f1be67bfb35","Type":"ContainerDied","Data":"4b11125e58cf108799d14f187d53fb8250977144039502dd8d57d734495af876"} Oct 11 03:57:57 crc kubenswrapper[4703]: I1011 03:57:57.408349 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8zb8v" event={"ID":"08edae5f-0299-4ef5-98d1-3f1be67bfb35","Type":"ContainerStarted","Data":"268e60a802c65b4d57e1abcd6834d9fc4970df6ca1eb94b90e3b296c0061f97c"} Oct 11 03:57:57 crc kubenswrapper[4703]: I1011 03:57:57.411740 4703 generic.go:334] "Generic (PLEG): container finished" podID="9b932657-cddd-4fd6-b45b-074f292386da" containerID="7c527f225ac836e783ecd8c74b04dc242d5777632dcd46d65607827c522c2742" exitCode=0 Oct 11 03:57:57 crc kubenswrapper[4703]: I1011 03:57:57.411781 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcrkn" event={"ID":"9b932657-cddd-4fd6-b45b-074f292386da","Type":"ContainerDied","Data":"7c527f225ac836e783ecd8c74b04dc242d5777632dcd46d65607827c522c2742"} Oct 11 03:57:57 crc kubenswrapper[4703]: I1011 03:57:57.450454 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8zb8v" podStartSLOduration=1.9339137069999999 podStartE2EDuration="4.45043814s" podCreationTimestamp="2025-10-11 03:57:53 +0000 UTC" firstStartedPulling="2025-10-11 03:57:54.359642016 +0000 UTC m=+165.570123938" lastFinishedPulling="2025-10-11 03:57:56.876166449 +0000 UTC m=+168.086648371" observedRunningTime="2025-10-11 03:57:57.435088193 +0000 UTC m=+168.645570105" watchObservedRunningTime="2025-10-11 03:57:57.45043814 +0000 UTC m=+168.660920062" Oct 11 03:57:58 crc kubenswrapper[4703]: I1011 03:57:58.418641 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pcrkn" 
event={"ID":"9b932657-cddd-4fd6-b45b-074f292386da","Type":"ContainerStarted","Data":"3f8567292abb7c950c6619f33e65fd4c30eb7685f164aa02642f7e24d5f2ec46"} Oct 11 03:57:58 crc kubenswrapper[4703]: I1011 03:57:58.440288 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pcrkn" podStartSLOduration=1.898986852 podStartE2EDuration="5.440271921s" podCreationTimestamp="2025-10-11 03:57:53 +0000 UTC" firstStartedPulling="2025-10-11 03:57:54.359602275 +0000 UTC m=+165.570084197" lastFinishedPulling="2025-10-11 03:57:57.900887344 +0000 UTC m=+169.111369266" observedRunningTime="2025-10-11 03:57:58.438827614 +0000 UTC m=+169.649309556" watchObservedRunningTime="2025-10-11 03:57:58.440271921 +0000 UTC m=+169.650753843" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.081098 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.081516 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.137442 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.252645 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.252715 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.312945 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.500857 4703 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhh4v" Oct 11 03:58:01 crc kubenswrapper[4703]: I1011 03:58:01.513075 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwlnc" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.542159 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.542537 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.590734 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.649791 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.649837 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:58:03 crc kubenswrapper[4703]: I1011 03:58:03.709007 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:58:04 crc kubenswrapper[4703]: I1011 03:58:04.521845 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pcrkn" Oct 11 03:58:04 crc kubenswrapper[4703]: I1011 03:58:04.530373 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8zb8v" Oct 11 03:58:20 crc kubenswrapper[4703]: I1011 03:58:20.255188 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:58:20 crc kubenswrapper[4703]: I1011 03:58:20.255871 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.255161 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.255875 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.255949 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.256812 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 03:58:50 crc 
kubenswrapper[4703]: I1011 03:58:50.256918 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22" gracePeriod=600 Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.774783 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22" exitCode=0 Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.774883 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22"} Oct 11 03:58:50 crc kubenswrapper[4703]: I1011 03:58:50.775198 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434"} Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.146894 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4"] Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.149740 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.152293 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.152616 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.156839 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4"] Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.226784 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82pkt\" (UniqueName: \"kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.226846 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.226934 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.328423 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.328626 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82pkt\" (UniqueName: \"kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.328673 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.330285 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.338206 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.344902 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82pkt\" (UniqueName: \"kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt\") pod \"collect-profiles-29335920-gxjs4\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.467080 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:00 crc kubenswrapper[4703]: I1011 04:00:00.679090 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4"] Oct 11 04:00:01 crc kubenswrapper[4703]: I1011 04:00:01.211415 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" event={"ID":"594d951e-e78f-4165-8d22-c5db39fd5ab9","Type":"ContainerStarted","Data":"c7b543a11168c19826097799f8c5051bc23d256e45df76af7c5aa334194721f8"} Oct 11 04:00:01 crc kubenswrapper[4703]: I1011 04:00:01.212759 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" event={"ID":"594d951e-e78f-4165-8d22-c5db39fd5ab9","Type":"ContainerStarted","Data":"7c430a96fde6307b794167cfb80df9b3e47904c53d4191719360982543827495"} Oct 11 04:00:02 crc kubenswrapper[4703]: I1011 04:00:02.220714 4703 generic.go:334] "Generic (PLEG): container finished" podID="594d951e-e78f-4165-8d22-c5db39fd5ab9" 
containerID="c7b543a11168c19826097799f8c5051bc23d256e45df76af7c5aa334194721f8" exitCode=0 Oct 11 04:00:02 crc kubenswrapper[4703]: I1011 04:00:02.220776 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" event={"ID":"594d951e-e78f-4165-8d22-c5db39fd5ab9","Type":"ContainerDied","Data":"c7b543a11168c19826097799f8c5051bc23d256e45df76af7c5aa334194721f8"} Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.461249 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.568189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume\") pod \"594d951e-e78f-4165-8d22-c5db39fd5ab9\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.568696 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume\") pod \"594d951e-e78f-4165-8d22-c5db39fd5ab9\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.568891 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82pkt\" (UniqueName: \"kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt\") pod \"594d951e-e78f-4165-8d22-c5db39fd5ab9\" (UID: \"594d951e-e78f-4165-8d22-c5db39fd5ab9\") " Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.569146 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"594d951e-e78f-4165-8d22-c5db39fd5ab9" (UID: "594d951e-e78f-4165-8d22-c5db39fd5ab9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.570558 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/594d951e-e78f-4165-8d22-c5db39fd5ab9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.577123 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "594d951e-e78f-4165-8d22-c5db39fd5ab9" (UID: "594d951e-e78f-4165-8d22-c5db39fd5ab9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.577169 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt" (OuterVolumeSpecName: "kube-api-access-82pkt") pod "594d951e-e78f-4165-8d22-c5db39fd5ab9" (UID: "594d951e-e78f-4165-8d22-c5db39fd5ab9"). InnerVolumeSpecName "kube-api-access-82pkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.672976 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82pkt\" (UniqueName: \"kubernetes.io/projected/594d951e-e78f-4165-8d22-c5db39fd5ab9-kube-api-access-82pkt\") on node \"crc\" DevicePath \"\"" Oct 11 04:00:03 crc kubenswrapper[4703]: I1011 04:00:03.673030 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/594d951e-e78f-4165-8d22-c5db39fd5ab9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 04:00:04 crc kubenswrapper[4703]: I1011 04:00:04.234130 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" event={"ID":"594d951e-e78f-4165-8d22-c5db39fd5ab9","Type":"ContainerDied","Data":"7c430a96fde6307b794167cfb80df9b3e47904c53d4191719360982543827495"} Oct 11 04:00:04 crc kubenswrapper[4703]: I1011 04:00:04.234192 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c430a96fde6307b794167cfb80df9b3e47904c53d4191719360982543827495" Oct 11 04:00:04 crc kubenswrapper[4703]: I1011 04:00:04.234261 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335920-gxjs4" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.135884 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p77zr"] Oct 11 04:00:24 crc kubenswrapper[4703]: E1011 04:00:24.137737 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594d951e-e78f-4165-8d22-c5db39fd5ab9" containerName="collect-profiles" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.137859 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="594d951e-e78f-4165-8d22-c5db39fd5ab9" containerName="collect-profiles" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.138103 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="594d951e-e78f-4165-8d22-c5db39fd5ab9" containerName="collect-profiles" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.139737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.151494 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p77zr"] Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.272610 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-bound-sa-token\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.272691 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-certificates\") pod 
\"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.272733 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52bfa4d6-08c0-449b-921c-df332781f6f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.272819 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-trusted-ca\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.272948 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52bfa4d6-08c0-449b-921c-df332781f6f2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.273166 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgz9\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-kube-api-access-msgz9\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.273239 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.273306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-tls\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.300105 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374308 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msgz9\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-kube-api-access-msgz9\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374382 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-tls\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374413 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-bound-sa-token\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374433 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-certificates\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374457 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52bfa4d6-08c0-449b-921c-df332781f6f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374505 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-trusted-ca\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.374525 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52bfa4d6-08c0-449b-921c-df332781f6f2-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.375100 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52bfa4d6-08c0-449b-921c-df332781f6f2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.375714 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-certificates\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.377169 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52bfa4d6-08c0-449b-921c-df332781f6f2-trusted-ca\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.383117 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-registry-tls\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.384009 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/52bfa4d6-08c0-449b-921c-df332781f6f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.398987 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-bound-sa-token\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.405996 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgz9\" (UniqueName: \"kubernetes.io/projected/52bfa4d6-08c0-449b-921c-df332781f6f2-kube-api-access-msgz9\") pod \"image-registry-66df7c8f76-p77zr\" (UID: \"52bfa4d6-08c0-449b-921c-df332781f6f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.459244 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:24 crc kubenswrapper[4703]: I1011 04:00:24.755018 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p77zr"] Oct 11 04:00:25 crc kubenswrapper[4703]: I1011 04:00:25.377973 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" event={"ID":"52bfa4d6-08c0-449b-921c-df332781f6f2","Type":"ContainerStarted","Data":"6faea0c11411d11e87c331465814c4f02a87630fff57105e8aa36c87d1aee078"} Oct 11 04:00:25 crc kubenswrapper[4703]: I1011 04:00:25.378487 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" event={"ID":"52bfa4d6-08c0-449b-921c-df332781f6f2","Type":"ContainerStarted","Data":"cede850e26bef44010ee3f203182dcb212f8fb2f2528a4d56e6e75ceb20c1b42"} Oct 11 04:00:25 crc kubenswrapper[4703]: I1011 04:00:25.378524 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:25 crc kubenswrapper[4703]: I1011 04:00:25.408073 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" podStartSLOduration=1.408051414 podStartE2EDuration="1.408051414s" podCreationTimestamp="2025-10-11 04:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:00:25.404817268 +0000 UTC m=+316.615299210" watchObservedRunningTime="2025-10-11 04:00:25.408051414 +0000 UTC m=+316.618533376" Oct 11 04:00:44 crc kubenswrapper[4703]: I1011 04:00:44.466343 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p77zr" Oct 11 04:00:44 crc kubenswrapper[4703]: I1011 04:00:44.542741 4703 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 04:00:50 crc kubenswrapper[4703]: I1011 04:00:50.255787 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:00:50 crc kubenswrapper[4703]: I1011 04:00:50.256322 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:01:09 crc kubenswrapper[4703]: I1011 04:01:09.601569 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" podUID="92095718-20ba-4b03-b949-e9a26009e283" containerName="registry" containerID="cri-o://962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6" gracePeriod=30 Oct 11 04:01:09 crc kubenswrapper[4703]: I1011 04:01:09.956869 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078019 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078616 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078791 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078859 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078918 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98w9g\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.078993 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079262 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079345 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls\") pod \"92095718-20ba-4b03-b949-e9a26009e283\" (UID: \"92095718-20ba-4b03-b949-e9a26009e283\") " Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079634 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079798 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079957 4703 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.079982 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92095718-20ba-4b03-b949-e9a26009e283-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.100873 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.182260 4703 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92095718-20ba-4b03-b949-e9a26009e283-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.253064 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g" (OuterVolumeSpecName: "kube-api-access-98w9g") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "kube-api-access-98w9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.253149 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.254120 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.254458 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.266590 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "92095718-20ba-4b03-b949-e9a26009e283" (UID: "92095718-20ba-4b03-b949-e9a26009e283"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.284040 4703 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92095718-20ba-4b03-b949-e9a26009e283-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.284091 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98w9g\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-kube-api-access-98w9g\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.284113 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.284130 4703 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92095718-20ba-4b03-b949-e9a26009e283-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.683423 4703 generic.go:334] "Generic (PLEG): container finished" podID="92095718-20ba-4b03-b949-e9a26009e283" containerID="962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6" exitCode=0 Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.683551 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" event={"ID":"92095718-20ba-4b03-b949-e9a26009e283","Type":"ContainerDied","Data":"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6"} Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.683575 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.683667 4703 scope.go:117] "RemoveContainer" containerID="962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.683637 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q44w9" event={"ID":"92095718-20ba-4b03-b949-e9a26009e283","Type":"ContainerDied","Data":"e758d79d1a62e78e6af72ad4ce6eeca1a5922845895f24b985711e2f6bf0a285"} Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.714652 4703 scope.go:117] "RemoveContainer" containerID="962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6" Oct 11 04:01:10 crc kubenswrapper[4703]: E1011 04:01:10.715248 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6\": container with ID starting with 962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6 not found: ID does not exist" containerID="962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.715290 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6"} err="failed to get container status \"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6\": rpc error: code = NotFound desc = could not find container \"962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6\": container with ID starting with 962b74a79ca119d66ddf70d2af94ff70995d285c975176afdbf4aefcc6769da6 not found: ID does not exist" Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.734165 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 04:01:10 crc kubenswrapper[4703]: I1011 04:01:10.739365 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q44w9"] Oct 11 04:01:11 crc kubenswrapper[4703]: I1011 04:01:11.547895 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92095718-20ba-4b03-b949-e9a26009e283" path="/var/lib/kubelet/pods/92095718-20ba-4b03-b949-e9a26009e283/volumes" Oct 11 04:01:20 crc kubenswrapper[4703]: I1011 04:01:20.254803 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:01:20 crc kubenswrapper[4703]: I1011 04:01:20.255669 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.255921 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.256901 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 
04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.258003 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.259852 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.260101 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434" gracePeriod=600 Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.958953 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434" exitCode=0 Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.959066 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434"} Oct 11 04:01:50 crc kubenswrapper[4703]: I1011 04:01:50.959563 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f"} Oct 11 04:01:50 crc kubenswrapper[4703]: 
I1011 04:01:50.959606 4703 scope.go:117] "RemoveContainer" containerID="a4f73efcf3bef36e6452025ae44770270a644dd02150c7a3e760ab13c1488d22" Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.643336 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4jc5"] Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.646689 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-controller" containerID="cri-o://00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.646954 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="northd" containerID="cri-o://f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.646995 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-node" containerID="cri-o://cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.647037 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="sbdb" containerID="cri-o://7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.647163 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" 
containerName="nbdb" containerID="cri-o://2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.647195 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.647310 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-acl-logging" containerID="cri-o://5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.757852 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovnkube-controller" containerID="cri-o://de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" gracePeriod=30 Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.991535 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4jc5_8090d9aa-59c5-4c77-a4c0-94f2fa8d4426/ovn-acl-logging/0.log" Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.992421 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4jc5_8090d9aa-59c5-4c77-a4c0-94f2fa8d4426/ovn-controller/0.log" Oct 11 04:03:21 crc kubenswrapper[4703]: I1011 04:03:21.993067 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045655 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lx5cm"] Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045867 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovnkube-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045881 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovnkube-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045893 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="nbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045902 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="nbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045913 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92095718-20ba-4b03-b949-e9a26009e283" containerName="registry" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045921 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="92095718-20ba-4b03-b949-e9a26009e283" containerName="registry" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045937 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-node" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045944 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-node" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045954 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" 
containerName="ovn-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045962 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045973 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kubecfg-setup" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.045981 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kubecfg-setup" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.045993 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="sbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046001 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="sbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.046012 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-acl-logging" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046020 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-acl-logging" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.046034 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046042 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.046056 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" 
containerName="northd" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046064 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="northd" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046172 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-acl-logging" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046185 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="northd" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046196 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="nbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046209 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-node" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046217 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046226 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovnkube-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046238 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="ovn-controller" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046250 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerName="sbdb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.046259 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="92095718-20ba-4b03-b949-e9a26009e283" 
containerName="registry" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.048486 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133423 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133552 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133590 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133617 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133650 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc 
kubenswrapper[4703]: I1011 04:03:22.133671 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133694 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133698 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket" (OuterVolumeSpecName: "log-socket") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133759 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133765 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133789 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgh7c\" (UniqueName: \"kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133815 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133843 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133866 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133887 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133913 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133935 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133960 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.133980 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134005 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134055 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: 
\"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134076 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch\") pod \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\" (UID: \"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426\") " Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134181 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134222 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-var-lib-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134251 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-log-socket\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134348 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-kubelet\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134380 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134403 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-bin\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134423 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfdl\" (UniqueName: \"kubernetes.io/projected/0cab307c-2039-45cc-b2ed-24f3c86eec92-kube-api-access-fmfdl\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134444 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-systemd\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134493 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-openvswitch\") pod 
\"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134458 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134520 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-node-log\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134542 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-config\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134546 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134571 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-netd\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134590 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134649 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134656 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134678 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134685 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134713 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash" (OuterVolumeSpecName: "host-slash") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134675 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134730 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134746 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134781 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134597 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovn-node-metrics-cert\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134840 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log" (OuterVolumeSpecName: "node-log") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.134967 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-etc-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135151 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135181 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-env-overrides\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135233 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-ovn\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135307 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-netns\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135344 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-slash\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135420 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-script-lib\") pod \"ovnkube-node-lx5cm\" (UID: 
\"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135532 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-systemd-units\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135578 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135692 4703 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135724 4703 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135747 4703 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135772 4703 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-kubelet\") on node \"crc\" DevicePath \"\"" 
Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135792 4703 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135811 4703 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-node-log\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135833 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135854 4703 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135875 4703 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135896 4703 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135919 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135941 4703 
reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135962 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.135984 4703 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-host-slash\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.136004 4703 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.136028 4703 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-log-socket\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.136049 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.139327 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.140284 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c" (OuterVolumeSpecName: "kube-api-access-jgh7c") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "kube-api-access-jgh7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.148789 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" (UID: "8090d9aa-59c5-4c77-a4c0-94f2fa8d4426"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.236849 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-env-overrides\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.236952 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-ovn\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237004 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-netns\") pod \"ovnkube-node-lx5cm\" (UID: 
\"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237056 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-slash\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237081 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-ovn\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237110 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-script-lib\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237188 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-netns\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-systemd-units\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc 
kubenswrapper[4703]: I1011 04:03:22.237187 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-slash\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237199 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-systemd-units\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237327 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237371 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-var-lib-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237405 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-log-socket\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237436 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-kubelet\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237514 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-var-lib-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237515 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237572 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237599 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-systemd\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237633 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-bin\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237544 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237653 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-systemd\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237655 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfdl\" (UniqueName: \"kubernetes.io/projected/0cab307c-2039-45cc-b2ed-24f3c86eec92-kube-api-access-fmfdl\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237692 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-bin\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237647 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-kubelet\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237730 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237757 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-run-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237771 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-node-log\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237802 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-config\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237845 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-netd\") pod 
\"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237877 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-host-cni-netd\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237886 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovn-node-metrics-cert\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237856 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-node-log\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237941 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-etc-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.237525 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-log-socket\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238050 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cab307c-2039-45cc-b2ed-24f3c86eec92-etc-openvswitch\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238097 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-env-overrides\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238133 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238162 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgh7c\" (UniqueName: \"kubernetes.io/projected/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-kube-api-access-jgh7c\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238184 4703 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238500 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-script-lib\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.238839 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovnkube-config\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.242432 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cab307c-2039-45cc-b2ed-24f3c86eec92-ovn-node-metrics-cert\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.256133 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfdl\" (UniqueName: \"kubernetes.io/projected/0cab307c-2039-45cc-b2ed-24f3c86eec92-kube-api-access-fmfdl\") pod \"ovnkube-node-lx5cm\" (UID: \"0cab307c-2039-45cc-b2ed-24f3c86eec92\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.369043 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.588701 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgqng_6b6de354-b085-4f66-ac6c-4eb6005aa965/kube-multus/0.log" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.588890 4703 generic.go:334] "Generic (PLEG): container finished" podID="6b6de354-b085-4f66-ac6c-4eb6005aa965" containerID="d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2" exitCode=2 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.588997 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgqng" event={"ID":"6b6de354-b085-4f66-ac6c-4eb6005aa965","Type":"ContainerDied","Data":"d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.589879 4703 scope.go:117] "RemoveContainer" containerID="d87e8dad79964dbc6e834c6da67c25f96267dc752ed1f3c8f1608ae7629d60e2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.590982 4703 generic.go:334] "Generic (PLEG): container finished" podID="0cab307c-2039-45cc-b2ed-24f3c86eec92" containerID="5dcaa4c6de266831942eb84afd24ad8a4050565df9c5eca0ab54db9e3e123f3e" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.591030 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerDied","Data":"5dcaa4c6de266831942eb84afd24ad8a4050565df9c5eca0ab54db9e3e123f3e"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.591091 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"e12e16bf20cb8a8ba04d763c3736aacd17e1482c4b3760dcaa56f23fe4085917"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.600118 4703 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4jc5_8090d9aa-59c5-4c77-a4c0-94f2fa8d4426/ovn-acl-logging/0.log" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.600774 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4jc5_8090d9aa-59c5-4c77-a4c0-94f2fa8d4426/ovn-controller/0.log" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.601789 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.601910 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.601989 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602068 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602156 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602248 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" exitCode=0 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602331 4703 
generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" exitCode=143 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602408 4703 generic.go:334] "Generic (PLEG): container finished" podID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" exitCode=143 Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.601910 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602681 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602800 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602886 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602747 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.601891 4703 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.602968 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603207 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603241 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603262 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603275 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603291 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603307 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603320 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603331 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603341 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603352 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603363 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603374 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603388 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603399 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603413 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603428 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603442 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603453 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603499 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603531 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603553 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603568 4703 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603582 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603596 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603618 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4jc5" event={"ID":"8090d9aa-59c5-4c77-a4c0-94f2fa8d4426","Type":"ContainerDied","Data":"377cc690dc6da19f1109e489b1b681e74bce4e48325c37bd421da64d645ae93b"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603648 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603662 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603673 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603683 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} Oct 11 
04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603694 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603704 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603715 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603725 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.603735 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.650799 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.685672 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.695542 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4jc5"] Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.698267 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4jc5"] Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.710166 
4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.725723 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.751041 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.773729 4703 scope.go:117] "RemoveContainer" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.811552 4703 scope.go:117] "RemoveContainer" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.836941 4703 scope.go:117] "RemoveContainer" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.855950 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.856616 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not exist" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.856655 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} err="failed to get container status \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": rpc 
error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.856680 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.857021 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.857096 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} err="failed to get container status \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.857132 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.857637 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 
2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.857674 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} err="failed to get container status \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.857695 4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.858018 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not exist" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858048 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} err="failed to get container status \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not 
exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858071 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.858332 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858358 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} err="failed to get container status \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858378 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.858649 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858676 4703 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} err="failed to get container status \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858695 4703 scope.go:117] "RemoveContainer" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.858903 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": container with ID starting with 5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2 not found: ID does not exist" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858932 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} err="failed to get container status \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": rpc error: code = NotFound desc = could not find container \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": container with ID starting with 5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.858949 4703 scope.go:117] "RemoveContainer" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.859343 4703 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": container with ID starting with 00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789 not found: ID does not exist" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859370 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} err="failed to get container status \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": rpc error: code = NotFound desc = could not find container \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": container with ID starting with 00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859387 4703 scope.go:117] "RemoveContainer" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: E1011 04:03:22.859644 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": container with ID starting with 295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5 not found: ID does not exist" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859666 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} err="failed to get container status \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": rpc error: code = NotFound desc = could 
not find container \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": container with ID starting with 295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859688 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859967 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} err="failed to get container status \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": rpc error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.859994 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.860262 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} err="failed to get container status \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.860296 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 
04:03:22.860602 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} err="failed to get container status \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.860677 4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.860919 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} err="failed to get container status \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.860948 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.861295 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} err="failed to get container status \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with 
b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.861321 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.861618 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} err="failed to get container status \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.861645 4703 scope.go:117] "RemoveContainer" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862059 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} err="failed to get container status \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": rpc error: code = NotFound desc = could not find container \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": container with ID starting with 5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862081 4703 scope.go:117] "RemoveContainer" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862567 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} err="failed to get container status \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": rpc error: code = NotFound desc = could not find container \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": container with ID starting with 00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862589 4703 scope.go:117] "RemoveContainer" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862910 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} err="failed to get container status \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": rpc error: code = NotFound desc = could not find container \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": container with ID starting with 295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.862972 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863259 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} err="failed to get container status \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": rpc error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not 
exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863275 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863590 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} err="failed to get container status \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863614 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863883 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} err="failed to get container status \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.863909 4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864192 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} err="failed to get container status 
\"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864213 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864418 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} err="failed to get container status \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864436 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864790 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} err="failed to get container status \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.864809 4703 scope.go:117] "RemoveContainer" 
containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.865155 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} err="failed to get container status \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": rpc error: code = NotFound desc = could not find container \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": container with ID starting with 5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.865184 4703 scope.go:117] "RemoveContainer" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.865603 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} err="failed to get container status \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": rpc error: code = NotFound desc = could not find container \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": container with ID starting with 00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.865640 4703 scope.go:117] "RemoveContainer" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.866368 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} err="failed to get container status \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": rpc error: code = NotFound desc = could 
not find container \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": container with ID starting with 295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.866406 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.866925 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} err="failed to get container status \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": rpc error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.866948 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867232 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} err="failed to get container status \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867250 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 
04:03:22.867516 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} err="failed to get container status \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867533 4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867707 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} err="failed to get container status \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867726 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867913 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} err="failed to get container status \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with 
b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.867929 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868096 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} err="failed to get container status \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868110 4703 scope.go:117] "RemoveContainer" containerID="5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868371 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2"} err="failed to get container status \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": rpc error: code = NotFound desc = could not find container \"5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2\": container with ID starting with 5acab90dda80ca9a5f26446486eea0689403ad891b85d765af6a439181aa47f2 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868385 4703 scope.go:117] "RemoveContainer" containerID="00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868598 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789"} err="failed to get container status \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": rpc error: code = NotFound desc = could not find container \"00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789\": container with ID starting with 00e6bcaa58d0cbca2d738a616bb0a5bbf22d4563f2851f11e2c995b08a080789 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868613 4703 scope.go:117] "RemoveContainer" containerID="295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868845 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5"} err="failed to get container status \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": rpc error: code = NotFound desc = could not find container \"295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5\": container with ID starting with 295892d24c4de9ffda31b2be4cb00fdf1a108088677abebd05b7ab98c0f722f5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.868870 4703 scope.go:117] "RemoveContainer" containerID="de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869115 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb"} err="failed to get container status \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": rpc error: code = NotFound desc = could not find container \"de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb\": container with ID starting with de794e2f163f11d0639196abf1633be939f13e7a6925907982886f60f0b6decb not found: ID does not 
exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869131 4703 scope.go:117] "RemoveContainer" containerID="7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869322 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5"} err="failed to get container status \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": rpc error: code = NotFound desc = could not find container \"7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5\": container with ID starting with 7327b4c65a37d4d38fd797a79eada88617bce15e8ca9ea2145f1cd8b2e7a4ea5 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869353 4703 scope.go:117] "RemoveContainer" containerID="2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869656 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42"} err="failed to get container status \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": rpc error: code = NotFound desc = could not find container \"2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42\": container with ID starting with 2c7f1264e75bed906a827c0eeedcf3120809c53ede363c06629d4219ed82aa42 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.869680 4703 scope.go:117] "RemoveContainer" containerID="f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.870017 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07"} err="failed to get container status 
\"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": rpc error: code = NotFound desc = could not find container \"f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07\": container with ID starting with f275d204a8d4c20a7e6b6e7897724b671daa662a53ec87bdef257935fe5fbb07 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.870029 4703 scope.go:117] "RemoveContainer" containerID="b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.870250 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6"} err="failed to get container status \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": rpc error: code = NotFound desc = could not find container \"b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6\": container with ID starting with b95e578049157330f58c94e99a339d76cb71eb6672860a9c1364ae204fdc5dc6 not found: ID does not exist" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.870275 4703 scope.go:117] "RemoveContainer" containerID="cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a" Oct 11 04:03:22 crc kubenswrapper[4703]: I1011 04:03:22.870700 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a"} err="failed to get container status \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": rpc error: code = NotFound desc = could not find container \"cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a\": container with ID starting with cc87197c1fda9865f81c76cb123bdf355d8fea22b3b3c3c1100a3bc375481b8a not found: ID does not exist" Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.541036 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8090d9aa-59c5-4c77-a4c0-94f2fa8d4426" path="/var/lib/kubelet/pods/8090d9aa-59c5-4c77-a4c0-94f2fa8d4426/volumes" Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609283 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"7ce73ae98621c782c0b44fb2d8303d882c69b61521d70ef596c9b93e195f73b2"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609322 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"ffc874993937860b0835b80c92c937fa2f03d6c256e3b577ec1d8b2de6d6340e"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609332 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"300ac1fe58f76418f740e8144ea138bf8f5e78569201adcb78449d1e1d5a8002"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609340 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"42590d27490bff0c3c75a687ce2ab7ce9bdc1f3a10a198fce9b784d9926f9087"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609348 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"91820cb1b62eb1ac00d1a8f1e514fe5740f9cd7af3010f7f572eb7a325b87b09"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.609357 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" 
event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"8c16eee58189a760e844a03188cf19cd0c14d4378e31c0a4dd0e370e26155386"} Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.611681 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgqng_6b6de354-b085-4f66-ac6c-4eb6005aa965/kube-multus/0.log" Oct 11 04:03:23 crc kubenswrapper[4703]: I1011 04:03:23.611712 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgqng" event={"ID":"6b6de354-b085-4f66-ac6c-4eb6005aa965","Type":"ContainerStarted","Data":"9cd7c906043a8ff05c9343df17ff588a14e0b9134c71c21d6c24ba349d9bba61"} Oct 11 04:03:26 crc kubenswrapper[4703]: I1011 04:03:26.658040 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"cbd4cb415e9fe6b008a1725393812636eca4b7f10372e2278082196fbdcc099a"} Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.673682 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" event={"ID":"0cab307c-2039-45cc-b2ed-24f3c86eec92","Type":"ContainerStarted","Data":"59bc80ac269e0dac1ff49f37f09b1b2f22f133e4ca1b2620db41ec74e4f915fa"} Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.674192 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.674312 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.674336 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.708111 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.713719 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" podStartSLOduration=6.713701114 podStartE2EDuration="6.713701114s" podCreationTimestamp="2025-10-11 04:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:03:28.71206041 +0000 UTC m=+499.922542342" watchObservedRunningTime="2025-10-11 04:03:28.713701114 +0000 UTC m=+499.924183046" Oct 11 04:03:28 crc kubenswrapper[4703]: I1011 04:03:28.714514 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.289443 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp"] Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.291382 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.294275 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.307456 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp"] Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.383897 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.383950 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.383996 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggg2\" (UniqueName: \"kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: 
I1011 04:03:48.484904 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.485013 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggg2\" (UniqueName: \"kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.485134 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.485869 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.485949 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.518018 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggg2\" (UniqueName: \"kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.620126 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:48 crc kubenswrapper[4703]: I1011 04:03:48.897168 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp"] Oct 11 04:03:48 crc kubenswrapper[4703]: W1011 04:03:48.922611 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188f1812_b442_4b58_a5f9_4251a18bea8c.slice/crio-fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac WatchSource:0}: Error finding container fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac: Status 404 returned error can't find the container with id fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac Oct 11 04:03:49 crc kubenswrapper[4703]: I1011 04:03:49.811606 4703 generic.go:334] "Generic (PLEG): container finished" podID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerID="b81128e8ab01cc0bff284b76654997d90e194b33eeb603bf0e91cccd64d3f29c" exitCode=0 
Oct 11 04:03:49 crc kubenswrapper[4703]: I1011 04:03:49.811675 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" event={"ID":"188f1812-b442-4b58-a5f9-4251a18bea8c","Type":"ContainerDied","Data":"b81128e8ab01cc0bff284b76654997d90e194b33eeb603bf0e91cccd64d3f29c"} Oct 11 04:03:49 crc kubenswrapper[4703]: I1011 04:03:49.811754 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" event={"ID":"188f1812-b442-4b58-a5f9-4251a18bea8c","Type":"ContainerStarted","Data":"fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac"} Oct 11 04:03:49 crc kubenswrapper[4703]: I1011 04:03:49.814871 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 04:03:50 crc kubenswrapper[4703]: I1011 04:03:50.255756 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:03:50 crc kubenswrapper[4703]: I1011 04:03:50.255916 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:03:51 crc kubenswrapper[4703]: I1011 04:03:51.830633 4703 generic.go:334] "Generic (PLEG): container finished" podID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerID="03f03569a6eec48451ee55fdbbaa786f5d534b03f2172ffb9a5dc144af3a535d" exitCode=0 Oct 11 04:03:51 crc kubenswrapper[4703]: I1011 04:03:51.830841 4703 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" event={"ID":"188f1812-b442-4b58-a5f9-4251a18bea8c","Type":"ContainerDied","Data":"03f03569a6eec48451ee55fdbbaa786f5d534b03f2172ffb9a5dc144af3a535d"} Oct 11 04:03:52 crc kubenswrapper[4703]: I1011 04:03:52.401989 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lx5cm" Oct 11 04:03:52 crc kubenswrapper[4703]: I1011 04:03:52.841761 4703 generic.go:334] "Generic (PLEG): container finished" podID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerID="0803ab79d405cd84ef208b2ada91aa35d10166ab5feafc42cc7e5a05668d8cdb" exitCode=0 Oct 11 04:03:52 crc kubenswrapper[4703]: I1011 04:03:52.841838 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" event={"ID":"188f1812-b442-4b58-a5f9-4251a18bea8c","Type":"ContainerDied","Data":"0803ab79d405cd84ef208b2ada91aa35d10166ab5feafc42cc7e5a05668d8cdb"} Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.110776 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.172622 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle\") pod \"188f1812-b442-4b58-a5f9-4251a18bea8c\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.172734 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util\") pod \"188f1812-b442-4b58-a5f9-4251a18bea8c\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.172823 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fggg2\" (UniqueName: \"kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2\") pod \"188f1812-b442-4b58-a5f9-4251a18bea8c\" (UID: \"188f1812-b442-4b58-a5f9-4251a18bea8c\") " Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.173651 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle" (OuterVolumeSpecName: "bundle") pod "188f1812-b442-4b58-a5f9-4251a18bea8c" (UID: "188f1812-b442-4b58-a5f9-4251a18bea8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.178360 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2" (OuterVolumeSpecName: "kube-api-access-fggg2") pod "188f1812-b442-4b58-a5f9-4251a18bea8c" (UID: "188f1812-b442-4b58-a5f9-4251a18bea8c"). InnerVolumeSpecName "kube-api-access-fggg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.275552 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fggg2\" (UniqueName: \"kubernetes.io/projected/188f1812-b442-4b58-a5f9-4251a18bea8c-kube-api-access-fggg2\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.275606 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.491075 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util" (OuterVolumeSpecName: "util") pod "188f1812-b442-4b58-a5f9-4251a18bea8c" (UID: "188f1812-b442-4b58-a5f9-4251a18bea8c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.580189 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188f1812-b442-4b58-a5f9-4251a18bea8c-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.858180 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" event={"ID":"188f1812-b442-4b58-a5f9-4251a18bea8c","Type":"ContainerDied","Data":"fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac"} Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.858239 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe5ee1e3e4db9fcf55ad7b2855362836f985de76b3406b85846eba6535cd9ac" Oct 11 04:03:54 crc kubenswrapper[4703]: I1011 04:03:54.858250 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.517957 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b46999587-p89xp"] Oct 11 04:04:02 crc kubenswrapper[4703]: E1011 04:04:02.518517 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="pull" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.518528 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="pull" Oct 11 04:04:02 crc kubenswrapper[4703]: E1011 04:04:02.518541 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="util" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.518563 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="util" Oct 11 04:04:02 crc kubenswrapper[4703]: E1011 04:04:02.518572 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="extract" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.518577 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="extract" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.518689 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="188f1812-b442-4b58-a5f9-4251a18bea8c" containerName="extract" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.519108 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.520959 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.521096 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.523291 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.523784 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.531860 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b46999587-p89xp"] Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.535030 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-k74kn" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.582706 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjvs\" (UniqueName: \"kubernetes.io/projected/35cd67ee-b670-40f8-a7d5-7034e560930a-kube-api-access-wpjvs\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.582780 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-apiservice-cert\") pod 
\"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.582843 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-webhook-cert\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.684326 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-apiservice-cert\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.684390 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-webhook-cert\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.684421 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjvs\" (UniqueName: \"kubernetes.io/projected/35cd67ee-b670-40f8-a7d5-7034e560930a-kube-api-access-wpjvs\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc 
kubenswrapper[4703]: I1011 04:04:02.690141 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-webhook-cert\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.705152 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35cd67ee-b670-40f8-a7d5-7034e560930a-apiservice-cert\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.705184 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjvs\" (UniqueName: \"kubernetes.io/projected/35cd67ee-b670-40f8-a7d5-7034e560930a-kube-api-access-wpjvs\") pod \"metallb-operator-controller-manager-6b46999587-p89xp\" (UID: \"35cd67ee-b670-40f8-a7d5-7034e560930a\") " pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.776828 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh"] Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.777479 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.779336 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.779503 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.779575 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-vmbdn" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.801873 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh"] Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.833595 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.886174 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2kst\" (UniqueName: \"kubernetes.io/projected/40d928a1-8704-4635-a95b-930bac6ac447-kube-api-access-j2kst\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.886401 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-webhook-cert\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 
04:04:02.886486 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-apiservice-cert\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.987649 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2kst\" (UniqueName: \"kubernetes.io/projected/40d928a1-8704-4635-a95b-930bac6ac447-kube-api-access-j2kst\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.987750 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-webhook-cert\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.987782 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-apiservice-cert\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.994384 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-apiservice-cert\") pod 
\"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:02 crc kubenswrapper[4703]: I1011 04:04:02.996097 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40d928a1-8704-4635-a95b-930bac6ac447-webhook-cert\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.014193 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2kst\" (UniqueName: \"kubernetes.io/projected/40d928a1-8704-4635-a95b-930bac6ac447-kube-api-access-j2kst\") pod \"metallb-operator-webhook-server-79b78bfd4c-brzbh\" (UID: \"40d928a1-8704-4635-a95b-930bac6ac447\") " pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.051222 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b46999587-p89xp"] Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.091196 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.269369 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh"] Oct 11 04:04:03 crc kubenswrapper[4703]: W1011 04:04:03.455297 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40d928a1_8704_4635_a95b_930bac6ac447.slice/crio-2a3086456a11179ca352da19c46d9a78a44fd521b2bc4326387942178de6f09e WatchSource:0}: Error finding container 2a3086456a11179ca352da19c46d9a78a44fd521b2bc4326387942178de6f09e: Status 404 returned error can't find the container with id 2a3086456a11179ca352da19c46d9a78a44fd521b2bc4326387942178de6f09e Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.904534 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" event={"ID":"35cd67ee-b670-40f8-a7d5-7034e560930a","Type":"ContainerStarted","Data":"94c44a29585007f24b043e12ca195dbb20efb596a8131b5fd08c6e99410dfeb7"} Oct 11 04:04:03 crc kubenswrapper[4703]: I1011 04:04:03.905778 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" event={"ID":"40d928a1-8704-4635-a95b-930bac6ac447","Type":"ContainerStarted","Data":"2a3086456a11179ca352da19c46d9a78a44fd521b2bc4326387942178de6f09e"} Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.943301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" event={"ID":"40d928a1-8704-4635-a95b-930bac6ac447","Type":"ContainerStarted","Data":"8edc9788b403ce15fd67893faf640f434074de729a3232fe1cc1081800c9e44f"} Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.945432 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.945998 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" event={"ID":"35cd67ee-b670-40f8-a7d5-7034e560930a","Type":"ContainerStarted","Data":"43195b327e3921ca2719157c372885a33bc637610cd24b88baa30da12df71b71"} Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.946217 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.968879 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" podStartSLOduration=2.467138381 podStartE2EDuration="8.968855062s" podCreationTimestamp="2025-10-11 04:04:02 +0000 UTC" firstStartedPulling="2025-10-11 04:04:03.458228888 +0000 UTC m=+534.668710830" lastFinishedPulling="2025-10-11 04:04:09.959945579 +0000 UTC m=+541.170427511" observedRunningTime="2025-10-11 04:04:10.966778326 +0000 UTC m=+542.177260288" watchObservedRunningTime="2025-10-11 04:04:10.968855062 +0000 UTC m=+542.179336984" Oct 11 04:04:10 crc kubenswrapper[4703]: I1011 04:04:10.992926 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" podStartSLOduration=2.106768911 podStartE2EDuration="8.992902944s" podCreationTimestamp="2025-10-11 04:04:02 +0000 UTC" firstStartedPulling="2025-10-11 04:04:03.067476327 +0000 UTC m=+534.277958249" lastFinishedPulling="2025-10-11 04:04:09.95361035 +0000 UTC m=+541.164092282" observedRunningTime="2025-10-11 04:04:10.991020914 +0000 UTC m=+542.201502876" watchObservedRunningTime="2025-10-11 04:04:10.992902944 +0000 UTC m=+542.203384906" Oct 11 04:04:20 crc kubenswrapper[4703]: I1011 04:04:20.254734 4703 
patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:04:20 crc kubenswrapper[4703]: I1011 04:04:20.255258 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:04:23 crc kubenswrapper[4703]: I1011 04:04:23.097840 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79b78bfd4c-brzbh" Oct 11 04:04:42 crc kubenswrapper[4703]: I1011 04:04:42.838595 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6b46999587-p89xp" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.600777 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.601696 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.606041 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qwx8q" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.606151 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.607197 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.607523 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959f2\" (UniqueName: \"kubernetes.io/projected/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-kube-api-access-959f2\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.612543 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n642b"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.614495 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.617264 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.617419 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.634997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.680777 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-n8sml"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.681544 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.692320 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.692970 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.693212 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.693388 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hwjj4" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.711061 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-reloader\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 
04:04:43.711910 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.711938 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b7b0197-921b-4b00-b918-782fad0911ce-frr-startup\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.711964 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.711993 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-metrics\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712013 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712057 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wdvtm\" (UniqueName: \"kubernetes.io/projected/5ee9d2ec-7231-499f-818f-135260d80201-kube-api-access-wdvtm\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712101 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ee9d2ec-7231-499f-818f-135260d80201-metallb-excludel2\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712133 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959f2\" (UniqueName: \"kubernetes.io/projected/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-kube-api-access-959f2\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712161 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9k6q\" (UniqueName: \"kubernetes.io/projected/5b7b0197-921b-4b00-b918-782fad0911ce-kube-api-access-c9k6q\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712202 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-conf\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712229 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7b0197-921b-4b00-b918-782fad0911ce-metrics-certs\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.712265 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-sockets\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.726499 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-b6svk"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.727781 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.729911 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.731418 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.733667 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959f2\" (UniqueName: \"kubernetes.io/projected/7fabba07-23a6-4cb6-9e64-7b1ff55e3852-kube-api-access-959f2\") pod \"frr-k8s-webhook-server-64bf5d555-sdbdc\" (UID: \"7fabba07-23a6-4cb6-9e64-7b1ff55e3852\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.735201 4703 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-b6svk"] Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817099 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-cert\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817171 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvtm\" (UniqueName: \"kubernetes.io/projected/5ee9d2ec-7231-499f-818f-135260d80201-kube-api-access-wdvtm\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817207 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ee9d2ec-7231-499f-818f-135260d80201-metallb-excludel2\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48ts\" (UniqueName: \"kubernetes.io/projected/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-kube-api-access-v48ts\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817267 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9k6q\" (UniqueName: \"kubernetes.io/projected/5b7b0197-921b-4b00-b918-782fad0911ce-kube-api-access-c9k6q\") pod \"frr-k8s-n642b\" (UID: 
\"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817311 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-conf\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817330 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7b0197-921b-4b00-b918-782fad0911ce-metrics-certs\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817351 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-sockets\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817370 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-reloader\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817397 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b7b0197-921b-4b00-b918-782fad0911ce-frr-startup\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817418 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817444 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817492 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-metrics\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.817519 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.817653 4703 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.817712 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist podName:5ee9d2ec-7231-499f-818f-135260d80201 nodeName:}" failed. No retries permitted until 2025-10-11 04:04:44.317692451 +0000 UTC m=+575.528174383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist") pod "speaker-n8sml" (UID: "5ee9d2ec-7231-499f-818f-135260d80201") : secret "metallb-memberlist" not found Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.818901 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ee9d2ec-7231-499f-818f-135260d80201-metallb-excludel2\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.819358 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-conf\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.820224 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-frr-sockets\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.820513 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-metrics\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.820773 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b7b0197-921b-4b00-b918-782fad0911ce-reloader\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 
04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.821569 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b7b0197-921b-4b00-b918-782fad0911ce-frr-startup\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.821674 4703 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.821749 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs podName:5ee9d2ec-7231-499f-818f-135260d80201 nodeName:}" failed. No retries permitted until 2025-10-11 04:04:44.321728639 +0000 UTC m=+575.532210651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs") pod "speaker-n8sml" (UID: "5ee9d2ec-7231-499f-818f-135260d80201") : secret "speaker-certs-secret" not found Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.823951 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b7b0197-921b-4b00-b918-782fad0911ce-metrics-certs\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.839098 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9k6q\" (UniqueName: \"kubernetes.io/projected/5b7b0197-921b-4b00-b918-782fad0911ce-kube-api-access-c9k6q\") pod \"frr-k8s-n642b\" (UID: \"5b7b0197-921b-4b00-b918-782fad0911ce\") " pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.850166 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wdvtm\" (UniqueName: \"kubernetes.io/projected/5ee9d2ec-7231-499f-818f-135260d80201-kube-api-access-wdvtm\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.918965 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48ts\" (UniqueName: \"kubernetes.io/projected/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-kube-api-access-v48ts\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.919039 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.919110 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-cert\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.919233 4703 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 11 04:04:43 crc kubenswrapper[4703]: E1011 04:04:43.919306 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs podName:d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c nodeName:}" failed. No retries permitted until 2025-10-11 04:04:44.419286182 +0000 UTC m=+575.629768104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs") pod "controller-68d546b9d8-b6svk" (UID: "d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c") : secret "controller-certs-secret" not found Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.921081 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.932567 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.933320 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-cert\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.940221 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48ts\" (UniqueName: \"kubernetes.io/projected/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-kube-api-access-v48ts\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:43 crc kubenswrapper[4703]: I1011 04:04:43.952179 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.154267 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"6c4eb5bf0f2c929cfaea2d01d1f0eabd19aadb2112112bcefdcb64f54a8df1ed"} Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.191057 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc"] Oct 11 04:04:44 crc kubenswrapper[4703]: W1011 04:04:44.193823 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fabba07_23a6_4cb6_9e64_7b1ff55e3852.slice/crio-73dfec7516777b26247b4e2f42925152d7e10e2b8191aaf6d5a29eb6fe94e7ee WatchSource:0}: Error finding container 73dfec7516777b26247b4e2f42925152d7e10e2b8191aaf6d5a29eb6fe94e7ee: Status 404 returned error can't find the container with id 73dfec7516777b26247b4e2f42925152d7e10e2b8191aaf6d5a29eb6fe94e7ee Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.324967 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.325054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:44 crc kubenswrapper[4703]: E1011 04:04:44.325272 4703 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 11 04:04:44 crc 
kubenswrapper[4703]: E1011 04:04:44.325359 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist podName:5ee9d2ec-7231-499f-818f-135260d80201 nodeName:}" failed. No retries permitted until 2025-10-11 04:04:45.325338161 +0000 UTC m=+576.535820083 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist") pod "speaker-n8sml" (UID: "5ee9d2ec-7231-499f-818f-135260d80201") : secret "metallb-memberlist" not found Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.331487 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-metrics-certs\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.426100 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.431287 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c-metrics-certs\") pod \"controller-68d546b9d8-b6svk\" (UID: \"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c\") " pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.683403 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:44 crc kubenswrapper[4703]: I1011 04:04:44.929404 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-b6svk"] Oct 11 04:04:44 crc kubenswrapper[4703]: W1011 04:04:44.937860 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd772ffa5_ebd2_4e95_bf1f_9f3b01719c0c.slice/crio-493b8e25df2bdf20bf62a70ad3818b007cbed0536dfc90533673254300f7d39b WatchSource:0}: Error finding container 493b8e25df2bdf20bf62a70ad3818b007cbed0536dfc90533673254300f7d39b: Status 404 returned error can't find the container with id 493b8e25df2bdf20bf62a70ad3818b007cbed0536dfc90533673254300f7d39b Oct 11 04:04:45 crc kubenswrapper[4703]: I1011 04:04:45.162700 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-b6svk" event={"ID":"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c","Type":"ContainerStarted","Data":"493b8e25df2bdf20bf62a70ad3818b007cbed0536dfc90533673254300f7d39b"} Oct 11 04:04:45 crc kubenswrapper[4703]: I1011 04:04:45.164783 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" event={"ID":"7fabba07-23a6-4cb6-9e64-7b1ff55e3852","Type":"ContainerStarted","Data":"73dfec7516777b26247b4e2f42925152d7e10e2b8191aaf6d5a29eb6fe94e7ee"} Oct 11 04:04:45 crc kubenswrapper[4703]: I1011 04:04:45.340614 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:45 crc kubenswrapper[4703]: I1011 04:04:45.346376 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/5ee9d2ec-7231-499f-818f-135260d80201-memberlist\") pod \"speaker-n8sml\" (UID: \"5ee9d2ec-7231-499f-818f-135260d80201\") " pod="metallb-system/speaker-n8sml" Oct 11 04:04:45 crc kubenswrapper[4703]: I1011 04:04:45.521823 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-n8sml" Oct 11 04:04:46 crc kubenswrapper[4703]: I1011 04:04:46.173016 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n8sml" event={"ID":"5ee9d2ec-7231-499f-818f-135260d80201","Type":"ContainerStarted","Data":"a016a17e4b5073fcb11c1d3e4e14c2f865b4e06dafe21140813fda3e55f3ecb5"} Oct 11 04:04:46 crc kubenswrapper[4703]: I1011 04:04:46.173241 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n8sml" event={"ID":"5ee9d2ec-7231-499f-818f-135260d80201","Type":"ContainerStarted","Data":"a8134d34b3b09f9fff371f6d82397bcc678449d6a69c4e4d0a478f8eb5d91d28"} Oct 11 04:04:46 crc kubenswrapper[4703]: I1011 04:04:46.176125 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-b6svk" event={"ID":"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c","Type":"ContainerStarted","Data":"073e3615f50c2cb1763dfec8c2864a8232f6fd7f517d02969ff5cc8b45bab476"} Oct 11 04:04:50 crc kubenswrapper[4703]: I1011 04:04:50.255225 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:04:50 crc kubenswrapper[4703]: I1011 04:04:50.255605 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 11 04:04:50 crc kubenswrapper[4703]: I1011 04:04:50.255649 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:04:50 crc kubenswrapper[4703]: I1011 04:04:50.256238 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:04:50 crc kubenswrapper[4703]: I1011 04:04:50.256286 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f" gracePeriod=600 Oct 11 04:04:51 crc kubenswrapper[4703]: I1011 04:04:51.214737 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f" exitCode=0 Oct 11 04:04:51 crc kubenswrapper[4703]: I1011 04:04:51.214808 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f"} Oct 11 04:04:51 crc kubenswrapper[4703]: I1011 04:04:51.215236 4703 scope.go:117] "RemoveContainer" containerID="8f08e932c3a4125c9b6b2754aa44c21fa7de5a4f6158a78e3dc1896527e30434" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.220686 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-n8sml" event={"ID":"5ee9d2ec-7231-499f-818f-135260d80201","Type":"ContainerStarted","Data":"727c236def48afc47e92e1c003511c4053550b5d3f64f5c787e4e14b439fabae"} Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.221265 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-n8sml" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.223058 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852"} Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.225037 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-b6svk" event={"ID":"d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c","Type":"ContainerStarted","Data":"f4a0e7568756b674dd9742af51e90fb5069456f8ede40ddcbbee515aac650a97"} Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.225307 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.226161 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" event={"ID":"7fabba07-23a6-4cb6-9e64-7b1ff55e3852","Type":"ContainerStarted","Data":"3c4dc3a8edfce3e68d916dc50190f44b9efb295372d1189a1a9b734eb8e1a21a"} Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.226539 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.227711 4703 generic.go:334] "Generic (PLEG): container finished" podID="5b7b0197-921b-4b00-b918-782fad0911ce" containerID="af0fddc57223510c7fa372b57af679d730b11c2104956eb10477785fbb77bddb" exitCode=0 Oct 11 04:04:52 crc 
kubenswrapper[4703]: I1011 04:04:52.227738 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerDied","Data":"af0fddc57223510c7fa372b57af679d730b11c2104956eb10477785fbb77bddb"} Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.251590 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-n8sml" podStartSLOduration=3.731338699 podStartE2EDuration="9.25157362s" podCreationTimestamp="2025-10-11 04:04:43 +0000 UTC" firstStartedPulling="2025-10-11 04:04:45.904309212 +0000 UTC m=+577.114791134" lastFinishedPulling="2025-10-11 04:04:51.424544093 +0000 UTC m=+582.635026055" observedRunningTime="2025-10-11 04:04:52.247186513 +0000 UTC m=+583.457668435" watchObservedRunningTime="2025-10-11 04:04:52.25157362 +0000 UTC m=+583.462055542" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.272231 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-b6svk" podStartSLOduration=3.029408631 podStartE2EDuration="9.272217701s" podCreationTimestamp="2025-10-11 04:04:43 +0000 UTC" firstStartedPulling="2025-10-11 04:04:45.181744003 +0000 UTC m=+576.392225965" lastFinishedPulling="2025-10-11 04:04:51.424553073 +0000 UTC m=+582.635035035" observedRunningTime="2025-10-11 04:04:52.268451861 +0000 UTC m=+583.478933783" watchObservedRunningTime="2025-10-11 04:04:52.272217701 +0000 UTC m=+583.482699613" Oct 11 04:04:52 crc kubenswrapper[4703]: I1011 04:04:52.297650 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" podStartSLOduration=2.031495372 podStartE2EDuration="9.29763555s" podCreationTimestamp="2025-10-11 04:04:43 +0000 UTC" firstStartedPulling="2025-10-11 04:04:44.195886816 +0000 UTC m=+575.406368738" lastFinishedPulling="2025-10-11 04:04:51.462026954 +0000 UTC m=+582.672508916" 
observedRunningTime="2025-10-11 04:04:52.295638037 +0000 UTC m=+583.506119979" watchObservedRunningTime="2025-10-11 04:04:52.29763555 +0000 UTC m=+583.508117472" Oct 11 04:04:53 crc kubenswrapper[4703]: I1011 04:04:53.234953 4703 generic.go:334] "Generic (PLEG): container finished" podID="5b7b0197-921b-4b00-b918-782fad0911ce" containerID="24c2802c26479d7c7dd6309ed48c8e40e8991e75cc41796a39ba8887537fca51" exitCode=0 Oct 11 04:04:53 crc kubenswrapper[4703]: I1011 04:04:53.235029 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerDied","Data":"24c2802c26479d7c7dd6309ed48c8e40e8991e75cc41796a39ba8887537fca51"} Oct 11 04:04:54 crc kubenswrapper[4703]: I1011 04:04:54.247497 4703 generic.go:334] "Generic (PLEG): container finished" podID="5b7b0197-921b-4b00-b918-782fad0911ce" containerID="f816a69a54e6fa7daf7360eec24ea8755801a9a7dca14bd0f8332bfa83498e39" exitCode=0 Oct 11 04:04:54 crc kubenswrapper[4703]: I1011 04:04:54.247589 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerDied","Data":"f816a69a54e6fa7daf7360eec24ea8755801a9a7dca14bd0f8332bfa83498e39"} Oct 11 04:04:55 crc kubenswrapper[4703]: I1011 04:04:55.259066 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"8e9ae1ce3cd2ea70298187d575db9108036e7ee75b3f0357a2028769ec5733ca"} Oct 11 04:04:55 crc kubenswrapper[4703]: I1011 04:04:55.260053 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"b75c2aafa185993f41f439721f17e1acb6aa5e4f272ab7982d66b84dd16c35e1"} Oct 11 04:04:55 crc kubenswrapper[4703]: I1011 04:04:55.260067 4703 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"7a5b8272d8a1fc153aa0d862dca5b9982d0e851dc1972746eb242b1063dc917a"} Oct 11 04:04:55 crc kubenswrapper[4703]: I1011 04:04:55.260077 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"84143cf4951145a4dce55a135f6c894611d0a6a3cdda6a15108d7b0f04a92f86"} Oct 11 04:04:55 crc kubenswrapper[4703]: I1011 04:04:55.526397 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-n8sml" Oct 11 04:04:56 crc kubenswrapper[4703]: I1011 04:04:56.272894 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"223256e9be337dc1d419199a5810054bac5c6f123770378c56ce11d659dec3b8"} Oct 11 04:04:56 crc kubenswrapper[4703]: I1011 04:04:56.273288 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:56 crc kubenswrapper[4703]: I1011 04:04:56.273310 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n642b" event={"ID":"5b7b0197-921b-4b00-b918-782fad0911ce","Type":"ContainerStarted","Data":"7ea82877a193e3e92af1d1f71f4465f48ac182784ea631994062e6e0d4ed182a"} Oct 11 04:04:56 crc kubenswrapper[4703]: I1011 04:04:56.311679 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n642b" podStartSLOduration=5.985141419 podStartE2EDuration="13.311651668s" podCreationTimestamp="2025-10-11 04:04:43 +0000 UTC" firstStartedPulling="2025-10-11 04:04:44.098147167 +0000 UTC m=+575.308629089" lastFinishedPulling="2025-10-11 04:04:51.424657376 +0000 UTC m=+582.635139338" observedRunningTime="2025-10-11 04:04:56.307205851 +0000 UTC m=+587.517687783" 
watchObservedRunningTime="2025-10-11 04:04:56.311651668 +0000 UTC m=+587.522133620" Oct 11 04:04:58 crc kubenswrapper[4703]: I1011 04:04:58.953955 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n642b" Oct 11 04:04:59 crc kubenswrapper[4703]: I1011 04:04:59.026925 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n642b" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.827588 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.828682 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.831276 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.831281 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-qxbxq" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.832785 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.846546 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 04:05:01.879168 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skpvt\" (UniqueName: \"kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt\") pod \"mariadb-operator-index-599xn\" (UID: \"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57\") " pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:01 crc kubenswrapper[4703]: I1011 
04:05:01.979964 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skpvt\" (UniqueName: \"kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt\") pod \"mariadb-operator-index-599xn\" (UID: \"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57\") " pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:02 crc kubenswrapper[4703]: I1011 04:05:02.010333 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skpvt\" (UniqueName: \"kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt\") pod \"mariadb-operator-index-599xn\" (UID: \"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57\") " pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:02 crc kubenswrapper[4703]: I1011 04:05:02.147774 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:02 crc kubenswrapper[4703]: I1011 04:05:02.553251 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:02 crc kubenswrapper[4703]: W1011 04:05:02.560768 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ea5fea_d789_4a76_b9ec_6fc00e0aff57.slice/crio-5e5cef3a2d94e5fd27a3c7b8e59786fb59ca22c74728b42319f0198d6e9c5fd7 WatchSource:0}: Error finding container 5e5cef3a2d94e5fd27a3c7b8e59786fb59ca22c74728b42319f0198d6e9c5fd7: Status 404 returned error can't find the container with id 5e5cef3a2d94e5fd27a3c7b8e59786fb59ca22c74728b42319f0198d6e9c5fd7 Oct 11 04:05:03 crc kubenswrapper[4703]: I1011 04:05:03.327932 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-599xn" 
event={"ID":"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57","Type":"ContainerStarted","Data":"5e5cef3a2d94e5fd27a3c7b8e59786fb59ca22c74728b42319f0198d6e9c5fd7"} Oct 11 04:05:03 crc kubenswrapper[4703]: I1011 04:05:03.941883 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sdbdc" Oct 11 04:05:04 crc kubenswrapper[4703]: I1011 04:05:04.336731 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-599xn" event={"ID":"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57","Type":"ContainerStarted","Data":"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914"} Oct 11 04:05:04 crc kubenswrapper[4703]: I1011 04:05:04.357327 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-599xn" podStartSLOduration=2.4114205220000002 podStartE2EDuration="3.35729484s" podCreationTimestamp="2025-10-11 04:05:01 +0000 UTC" firstStartedPulling="2025-10-11 04:05:02.564353824 +0000 UTC m=+593.774835786" lastFinishedPulling="2025-10-11 04:05:03.510228172 +0000 UTC m=+594.720710104" observedRunningTime="2025-10-11 04:05:04.351416174 +0000 UTC m=+595.561898116" watchObservedRunningTime="2025-10-11 04:05:04.35729484 +0000 UTC m=+595.567776802" Oct 11 04:05:04 crc kubenswrapper[4703]: I1011 04:05:04.689006 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-b6svk" Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.200997 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.811376 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-t642v"] Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.813083 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.822855 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-t642v"] Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.848131 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbxc\" (UniqueName: \"kubernetes.io/projected/2cd95ee2-d109-4d57-a3b4-f0c841741119-kube-api-access-rkbxc\") pod \"mariadb-operator-index-t642v\" (UID: \"2cd95ee2-d109-4d57-a3b4-f0c841741119\") " pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.950176 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbxc\" (UniqueName: \"kubernetes.io/projected/2cd95ee2-d109-4d57-a3b4-f0c841741119-kube-api-access-rkbxc\") pod \"mariadb-operator-index-t642v\" (UID: \"2cd95ee2-d109-4d57-a3b4-f0c841741119\") " pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:05 crc kubenswrapper[4703]: I1011 04:05:05.983695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbxc\" (UniqueName: \"kubernetes.io/projected/2cd95ee2-d109-4d57-a3b4-f0c841741119-kube-api-access-rkbxc\") pod \"mariadb-operator-index-t642v\" (UID: \"2cd95ee2-d109-4d57-a3b4-f0c841741119\") " pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.175047 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.354298 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-599xn" podUID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" containerName="registry-server" containerID="cri-o://c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914" gracePeriod=2 Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.382592 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-t642v"] Oct 11 04:05:06 crc kubenswrapper[4703]: W1011 04:05:06.465406 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd95ee2_d109_4d57_a3b4_f0c841741119.slice/crio-20156d44883439f9c006af32b74e185ff7b965140eede835e833b11fdc7a1439 WatchSource:0}: Error finding container 20156d44883439f9c006af32b74e185ff7b965140eede835e833b11fdc7a1439: Status 404 returned error can't find the container with id 20156d44883439f9c006af32b74e185ff7b965140eede835e833b11fdc7a1439 Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.723450 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.764948 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skpvt\" (UniqueName: \"kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt\") pod \"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57\" (UID: \"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57\") " Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.771060 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt" (OuterVolumeSpecName: "kube-api-access-skpvt") pod "c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" (UID: "c0ea5fea-d789-4a76-b9ec-6fc00e0aff57"). InnerVolumeSpecName "kube-api-access-skpvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:05:06 crc kubenswrapper[4703]: I1011 04:05:06.867447 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skpvt\" (UniqueName: \"kubernetes.io/projected/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57-kube-api-access-skpvt\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.365134 4703 generic.go:334] "Generic (PLEG): container finished" podID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" containerID="c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914" exitCode=0 Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.365224 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-599xn" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.365218 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-599xn" event={"ID":"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57","Type":"ContainerDied","Data":"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914"} Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.366184 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-599xn" event={"ID":"c0ea5fea-d789-4a76-b9ec-6fc00e0aff57","Type":"ContainerDied","Data":"5e5cef3a2d94e5fd27a3c7b8e59786fb59ca22c74728b42319f0198d6e9c5fd7"} Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.366230 4703 scope.go:117] "RemoveContainer" containerID="c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.369682 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-t642v" event={"ID":"2cd95ee2-d109-4d57-a3b4-f0c841741119","Type":"ContainerStarted","Data":"a4ebb7ac10d3f0ba46918c1b8a5c4df3f32af5942f98dfc0f2eb05473864c9a2"} Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.369725 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-t642v" event={"ID":"2cd95ee2-d109-4d57-a3b4-f0c841741119","Type":"ContainerStarted","Data":"20156d44883439f9c006af32b74e185ff7b965140eede835e833b11fdc7a1439"} Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.395939 4703 scope.go:117] "RemoveContainer" containerID="c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914" Oct 11 04:05:07 crc kubenswrapper[4703]: E1011 04:05:07.396906 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914\": 
container with ID starting with c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914 not found: ID does not exist" containerID="c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.396955 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914"} err="failed to get container status \"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914\": rpc error: code = NotFound desc = could not find container \"c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914\": container with ID starting with c366db570eaddba93bfd833780842faf84a8528c6266a63f93a492fb5e9d4914 not found: ID does not exist" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.402589 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-t642v" podStartSLOduration=1.8819140060000001 podStartE2EDuration="2.402562684s" podCreationTimestamp="2025-10-11 04:05:05 +0000 UTC" firstStartedPulling="2025-10-11 04:05:06.472837115 +0000 UTC m=+597.683319057" lastFinishedPulling="2025-10-11 04:05:06.993485813 +0000 UTC m=+598.203967735" observedRunningTime="2025-10-11 04:05:07.394388859 +0000 UTC m=+598.604870861" watchObservedRunningTime="2025-10-11 04:05:07.402562684 +0000 UTC m=+598.613044636" Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.424708 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.430588 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-599xn"] Oct 11 04:05:07 crc kubenswrapper[4703]: I1011 04:05:07.546845 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" 
path="/var/lib/kubelet/pods/c0ea5fea-d789-4a76-b9ec-6fc00e0aff57/volumes" Oct 11 04:05:13 crc kubenswrapper[4703]: I1011 04:05:13.958683 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n642b" Oct 11 04:05:16 crc kubenswrapper[4703]: I1011 04:05:16.175628 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:16 crc kubenswrapper[4703]: I1011 04:05:16.175708 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:16 crc kubenswrapper[4703]: I1011 04:05:16.221784 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:16 crc kubenswrapper[4703]: I1011 04:05:16.484063 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-t642v" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.354768 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm"] Oct 11 04:05:24 crc kubenswrapper[4703]: E1011 04:05:24.355393 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" containerName="registry-server" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.355416 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" containerName="registry-server" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.355663 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ea5fea-d789-4a76-b9ec-6fc00e0aff57" containerName="registry-server" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.357068 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.359282 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.372637 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm"] Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.415003 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.415216 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s292t\" (UniqueName: \"kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.415281 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 
04:05:24.516660 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s292t\" (UniqueName: \"kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.516750 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.516858 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.517599 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.517936 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.552046 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s292t\" (UniqueName: \"kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t\") pod \"10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:24 crc kubenswrapper[4703]: I1011 04:05:24.676367 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:25 crc kubenswrapper[4703]: I1011 04:05:25.182653 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm"] Oct 11 04:05:25 crc kubenswrapper[4703]: I1011 04:05:25.503890 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" event={"ID":"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26","Type":"ContainerStarted","Data":"ab6f6a0595be9f87375153dbaad7d82730c0fb6bb7c0fa4b025e0dbb16026d2c"} Oct 11 04:05:26 crc kubenswrapper[4703]: I1011 04:05:26.513780 4703 generic.go:334] "Generic (PLEG): container finished" podID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerID="43768901e33d137c417d42c678314cefa8a5b7ff2125721cf7ab8674c08e8ffd" exitCode=0 Oct 11 04:05:26 crc kubenswrapper[4703]: I1011 04:05:26.513843 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" event={"ID":"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26","Type":"ContainerDied","Data":"43768901e33d137c417d42c678314cefa8a5b7ff2125721cf7ab8674c08e8ffd"} Oct 11 04:05:28 crc kubenswrapper[4703]: E1011 04:05:28.413160 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e7951b7_f9d2_4e34_a4ec_c64753ec9a26.slice/crio-conmon-542b4bf0f6e00dde0925e4e92faadcbd23169fb02818341267a2de0a80debff1.scope\": RecentStats: unable to find data in memory cache]" Oct 11 04:05:28 crc kubenswrapper[4703]: I1011 04:05:28.532736 4703 generic.go:334] "Generic (PLEG): container finished" podID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerID="542b4bf0f6e00dde0925e4e92faadcbd23169fb02818341267a2de0a80debff1" exitCode=0 Oct 11 04:05:28 crc kubenswrapper[4703]: I1011 04:05:28.532813 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" event={"ID":"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26","Type":"ContainerDied","Data":"542b4bf0f6e00dde0925e4e92faadcbd23169fb02818341267a2de0a80debff1"} Oct 11 04:05:29 crc kubenswrapper[4703]: I1011 04:05:29.544523 4703 generic.go:334] "Generic (PLEG): container finished" podID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerID="ac296bd9e14038905dceddbbc15f171a426f5748cfd7a5b8812325523319f3b1" exitCode=0 Oct 11 04:05:29 crc kubenswrapper[4703]: I1011 04:05:29.548929 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" event={"ID":"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26","Type":"ContainerDied","Data":"ac296bd9e14038905dceddbbc15f171a426f5748cfd7a5b8812325523319f3b1"} Oct 11 04:05:30 crc kubenswrapper[4703]: I1011 04:05:30.924203 4703 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.110398 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s292t\" (UniqueName: \"kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t\") pod \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.110542 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util\") pod \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.110717 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle\") pod \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\" (UID: \"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26\") " Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.112124 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle" (OuterVolumeSpecName: "bundle") pod "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" (UID: "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.121008 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t" (OuterVolumeSpecName: "kube-api-access-s292t") pod "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" (UID: "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26"). InnerVolumeSpecName "kube-api-access-s292t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.212590 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s292t\" (UniqueName: \"kubernetes.io/projected/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-kube-api-access-s292t\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.213006 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.350668 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util" (OuterVolumeSpecName: "util") pod "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" (UID: "7e7951b7-f9d2-4e34-a4ec-c64753ec9a26"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.416435 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e7951b7-f9d2-4e34-a4ec-c64753ec9a26-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.560334 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" event={"ID":"7e7951b7-f9d2-4e34-a4ec-c64753ec9a26","Type":"ContainerDied","Data":"ab6f6a0595be9f87375153dbaad7d82730c0fb6bb7c0fa4b025e0dbb16026d2c"} Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.560390 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab6f6a0595be9f87375153dbaad7d82730c0fb6bb7c0fa4b025e0dbb16026d2c" Oct 11 04:05:31 crc kubenswrapper[4703]: I1011 04:05:31.560396 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.065562 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb"] Oct 11 04:05:38 crc kubenswrapper[4703]: E1011 04:05:38.066505 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="pull" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.066528 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="pull" Oct 11 04:05:38 crc kubenswrapper[4703]: E1011 04:05:38.066558 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="util" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.066569 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="util" Oct 11 04:05:38 crc kubenswrapper[4703]: E1011 04:05:38.066587 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="extract" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.066598 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="extract" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.066808 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7951b7-f9d2-4e34-a4ec-c64753ec9a26" containerName="extract" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.067794 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.069622 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.069670 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.070003 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hxmg9" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.083388 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb"] Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.111357 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6vr2\" (UniqueName: \"kubernetes.io/projected/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-kube-api-access-v6vr2\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.111433 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-webhook-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.111533 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-apiservice-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.212669 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-apiservice-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.212766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vr2\" (UniqueName: \"kubernetes.io/projected/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-kube-api-access-v6vr2\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.212793 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-webhook-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.219820 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-webhook-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " 
pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.226338 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-apiservice-cert\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.244019 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vr2\" (UniqueName: \"kubernetes.io/projected/6d215033-9dda-4e97-8ba5-72dd0ecea5f9-kube-api-access-v6vr2\") pod \"mariadb-operator-controller-manager-86ffc69b7-nwnvb\" (UID: \"6d215033-9dda-4e97-8ba5-72dd0ecea5f9\") " pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.385362 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:38 crc kubenswrapper[4703]: I1011 04:05:38.604838 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb"] Oct 11 04:05:39 crc kubenswrapper[4703]: I1011 04:05:39.615873 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" event={"ID":"6d215033-9dda-4e97-8ba5-72dd0ecea5f9","Type":"ContainerStarted","Data":"9741d38e92014f336eb81d38a0b23733f987aa60248a738d7bbd4675f361ed41"} Oct 11 04:05:42 crc kubenswrapper[4703]: I1011 04:05:42.642155 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" event={"ID":"6d215033-9dda-4e97-8ba5-72dd0ecea5f9","Type":"ContainerStarted","Data":"941c6b424ae7e1010e3f7970729269a25dba59f14f92dab37808dfad551c8387"} Oct 11 04:05:44 crc kubenswrapper[4703]: I1011 04:05:44.660309 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" event={"ID":"6d215033-9dda-4e97-8ba5-72dd0ecea5f9","Type":"ContainerStarted","Data":"98ac97dd66ae933a1b029299c5a1d9ac77972d71105fce50d652e02b8bba69c5"} Oct 11 04:05:44 crc kubenswrapper[4703]: I1011 04:05:44.660969 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:44 crc kubenswrapper[4703]: I1011 04:05:44.687897 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" podStartSLOduration=1.023468672 podStartE2EDuration="6.687873337s" podCreationTimestamp="2025-10-11 04:05:38 +0000 UTC" firstStartedPulling="2025-10-11 04:05:38.613951359 +0000 UTC m=+629.824433281" lastFinishedPulling="2025-10-11 
04:05:44.278356024 +0000 UTC m=+635.488837946" observedRunningTime="2025-10-11 04:05:44.68235894 +0000 UTC m=+635.892840922" watchObservedRunningTime="2025-10-11 04:05:44.687873337 +0000 UTC m=+635.898355299" Oct 11 04:05:48 crc kubenswrapper[4703]: I1011 04:05:48.392576 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-86ffc69b7-nwnvb" Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.664652 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-vmm7z"] Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.665764 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.670452 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-p5qwq" Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.683309 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-vmm7z"] Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.795151 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9dx\" (UniqueName: \"kubernetes.io/projected/94b8f614-7bdc-4b98-9a5a-fccb2775b533-kube-api-access-tv9dx\") pod \"infra-operator-index-vmm7z\" (UID: \"94b8f614-7bdc-4b98-9a5a-fccb2775b533\") " pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.896409 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9dx\" (UniqueName: \"kubernetes.io/projected/94b8f614-7bdc-4b98-9a5a-fccb2775b533-kube-api-access-tv9dx\") pod \"infra-operator-index-vmm7z\" (UID: \"94b8f614-7bdc-4b98-9a5a-fccb2775b533\") " pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 
04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.927826 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9dx\" (UniqueName: \"kubernetes.io/projected/94b8f614-7bdc-4b98-9a5a-fccb2775b533-kube-api-access-tv9dx\") pod \"infra-operator-index-vmm7z\" (UID: \"94b8f614-7bdc-4b98-9a5a-fccb2775b533\") " pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:05:51 crc kubenswrapper[4703]: I1011 04:05:51.990666 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:05:52 crc kubenswrapper[4703]: I1011 04:05:52.348510 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-vmm7z"] Oct 11 04:05:52 crc kubenswrapper[4703]: I1011 04:05:52.722513 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vmm7z" event={"ID":"94b8f614-7bdc-4b98-9a5a-fccb2775b533","Type":"ContainerStarted","Data":"f27539901c844cc0af88b14105bf5088e3ec7b46b3e94af5d1f2cd3c5fed6709"} Oct 11 04:05:53 crc kubenswrapper[4703]: I1011 04:05:53.727767 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vmm7z" event={"ID":"94b8f614-7bdc-4b98-9a5a-fccb2775b533","Type":"ContainerStarted","Data":"22aed642d327f900825fa674b64b7d90dd3fa5bbca25522f9c64e683289d619a"} Oct 11 04:05:53 crc kubenswrapper[4703]: I1011 04:05:53.744708 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-vmm7z" podStartSLOduration=1.6667318290000002 podStartE2EDuration="2.744689036s" podCreationTimestamp="2025-10-11 04:05:51 +0000 UTC" firstStartedPulling="2025-10-11 04:05:52.352768198 +0000 UTC m=+643.563250110" lastFinishedPulling="2025-10-11 04:05:53.430725385 +0000 UTC m=+644.641207317" observedRunningTime="2025-10-11 04:05:53.740131415 +0000 UTC m=+644.950613347" watchObservedRunningTime="2025-10-11 
04:05:53.744689036 +0000 UTC m=+644.955170958" Oct 11 04:05:56 crc kubenswrapper[4703]: I1011 04:05:56.954458 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 04:05:56 crc kubenswrapper[4703]: I1011 04:05:56.955097 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerName="controller-manager" containerID="cri-o://d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54" gracePeriod=30 Oct 11 04:05:56 crc kubenswrapper[4703]: I1011 04:05:56.985721 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 04:05:56 crc kubenswrapper[4703]: I1011 04:05:56.985973 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" podUID="9425edee-98a4-4086-8d31-28478469501b" containerName="route-controller-manager" containerID="cri-o://00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86" gracePeriod=30 Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.369095 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.374735 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478748 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca\") pod \"9425edee-98a4-4086-8d31-28478469501b\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478802 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config\") pod \"ff55314a-7f2f-42c9-9723-2fdaad791959\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478840 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca\") pod \"ff55314a-7f2f-42c9-9723-2fdaad791959\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478866 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qr8\" (UniqueName: \"kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8\") pod \"ff55314a-7f2f-42c9-9723-2fdaad791959\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478941 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert\") pod \"9425edee-98a4-4086-8d31-28478469501b\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478956 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4ffhd\" (UniqueName: \"kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd\") pod \"9425edee-98a4-4086-8d31-28478469501b\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.478990 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert\") pod \"ff55314a-7f2f-42c9-9723-2fdaad791959\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.479029 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config\") pod \"9425edee-98a4-4086-8d31-28478469501b\" (UID: \"9425edee-98a4-4086-8d31-28478469501b\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.479049 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles\") pod \"ff55314a-7f2f-42c9-9723-2fdaad791959\" (UID: \"ff55314a-7f2f-42c9-9723-2fdaad791959\") " Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.479452 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca" (OuterVolumeSpecName: "client-ca") pod "9425edee-98a4-4086-8d31-28478469501b" (UID: "9425edee-98a4-4086-8d31-28478469501b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.479975 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ff55314a-7f2f-42c9-9723-2fdaad791959" (UID: "ff55314a-7f2f-42c9-9723-2fdaad791959"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.480344 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca" (OuterVolumeSpecName: "client-ca") pod "ff55314a-7f2f-42c9-9723-2fdaad791959" (UID: "ff55314a-7f2f-42c9-9723-2fdaad791959"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.480358 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config" (OuterVolumeSpecName: "config") pod "ff55314a-7f2f-42c9-9723-2fdaad791959" (UID: "ff55314a-7f2f-42c9-9723-2fdaad791959"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.480566 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config" (OuterVolumeSpecName: "config") pod "9425edee-98a4-4086-8d31-28478469501b" (UID: "9425edee-98a4-4086-8d31-28478469501b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.485083 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd" (OuterVolumeSpecName: "kube-api-access-4ffhd") pod "9425edee-98a4-4086-8d31-28478469501b" (UID: "9425edee-98a4-4086-8d31-28478469501b"). InnerVolumeSpecName "kube-api-access-4ffhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.485338 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ff55314a-7f2f-42c9-9723-2fdaad791959" (UID: "ff55314a-7f2f-42c9-9723-2fdaad791959"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.486077 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9425edee-98a4-4086-8d31-28478469501b" (UID: "9425edee-98a4-4086-8d31-28478469501b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.486569 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8" (OuterVolumeSpecName: "kube-api-access-t8qr8") pod "ff55314a-7f2f-42c9-9723-2fdaad791959" (UID: "ff55314a-7f2f-42c9-9723-2fdaad791959"). InnerVolumeSpecName "kube-api-access-t8qr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581202 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581242 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qr8\" (UniqueName: \"kubernetes.io/projected/ff55314a-7f2f-42c9-9723-2fdaad791959-kube-api-access-t8qr8\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581257 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9425edee-98a4-4086-8d31-28478469501b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581374 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ffhd\" (UniqueName: \"kubernetes.io/projected/9425edee-98a4-4086-8d31-28478469501b-kube-api-access-4ffhd\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581494 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff55314a-7f2f-42c9-9723-2fdaad791959-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581512 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-config\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581524 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581536 4703 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9425edee-98a4-4086-8d31-28478469501b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.581548 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff55314a-7f2f-42c9-9723-2fdaad791959-config\") on node \"crc\" DevicePath \"\"" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.753912 4703 generic.go:334] "Generic (PLEG): container finished" podID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerID="d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54" exitCode=0 Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.753950 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.753968 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" event={"ID":"ff55314a-7f2f-42c9-9723-2fdaad791959","Type":"ContainerDied","Data":"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54"} Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.754404 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lb9d" event={"ID":"ff55314a-7f2f-42c9-9723-2fdaad791959","Type":"ContainerDied","Data":"5e33f3193171c7eab23141033858cca2b4f7c6facbc936d19c5dfd714d09d480"} Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.754426 4703 scope.go:117] "RemoveContainer" containerID="d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.758523 4703 generic.go:334] "Generic (PLEG): container finished" podID="9425edee-98a4-4086-8d31-28478469501b" 
containerID="00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86" exitCode=0 Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.758595 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" event={"ID":"9425edee-98a4-4086-8d31-28478469501b","Type":"ContainerDied","Data":"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86"} Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.758641 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" event={"ID":"9425edee-98a4-4086-8d31-28478469501b","Type":"ContainerDied","Data":"880ce472699b61afb84f98f7edf57a941e8e557d202791e27418818d6e418c03"} Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.758612 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.776869 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.782584 4703 scope.go:117] "RemoveContainer" containerID="d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54" Oct 11 04:05:57 crc kubenswrapper[4703]: E1011 04:05:57.783098 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54\": container with ID starting with d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54 not found: ID does not exist" containerID="d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.783141 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54"} err="failed to get container status \"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54\": rpc error: code = NotFound desc = could not find container \"d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54\": container with ID starting with d4eaf64f7a44f58a84ad5c96c6492d13e9c9ea5d167e03ee91103a7634f9af54 not found: ID does not exist" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.783171 4703 scope.go:117] "RemoveContainer" containerID="00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.785003 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lb9d"] Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.799264 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.806449 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rz7sl"] Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.810605 4703 scope.go:117] "RemoveContainer" containerID="00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86" Oct 11 04:05:57 crc kubenswrapper[4703]: E1011 04:05:57.811353 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86\": container with ID starting with 00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86 not found: ID does not exist" containerID="00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86" Oct 11 04:05:57 crc kubenswrapper[4703]: I1011 04:05:57.811488 4703 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86"} err="failed to get container status \"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86\": rpc error: code = NotFound desc = could not find container \"00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86\": container with ID starting with 00f1719cbb9cac6b8a28acdf95fc99575f9b12484244c0ecd5171cfafcfa8e86 not found: ID does not exist" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.027922 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:05:58 crc kubenswrapper[4703]: E1011 04:05:58.028192 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerName="controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.028207 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerName="controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: E1011 04:05:58.028223 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9425edee-98a4-4086-8d31-28478469501b" containerName="route-controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.028231 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9425edee-98a4-4086-8d31-28478469501b" containerName="route-controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.028351 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9425edee-98a4-4086-8d31-28478469501b" containerName="route-controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.028370 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" containerName="controller-manager" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.028815 4703 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.031852 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.032217 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.032444 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.032662 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.034396 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.035359 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.046717 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.053241 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.086712 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " 
pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.086784 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.086835 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.086871 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkv7b\" (UniqueName: \"kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.086982 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.187657 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.187736 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.187768 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.187794 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.187822 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkv7b\" (UniqueName: \"kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.188753 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.189317 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.189381 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.192118 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.204545 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkv7b\" (UniqueName: \"kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b\") pod \"controller-manager-5685dcbdbc-4xgmt\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 
04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.293997 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.294386 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.346983 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9"] Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.347835 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.349495 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.351170 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.351596 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.351756 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.351884 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.352007 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 11 04:05:58 crc kubenswrapper[4703]: 
I1011 04:05:58.364107 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9"] Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.493594 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-client-ca\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.494108 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-config\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.494175 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-serving-cert\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.494261 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w7r\" (UniqueName: \"kubernetes.io/projected/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-kube-api-access-b6w7r\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc 
kubenswrapper[4703]: I1011 04:05:58.596292 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-client-ca\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.596355 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-config\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.597553 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-client-ca\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.598845 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-serving-cert\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.599224 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w7r\" (UniqueName: \"kubernetes.io/projected/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-kube-api-access-b6w7r\") pod 
\"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.601797 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-config\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.609331 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-serving-cert\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.620134 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w7r\" (UniqueName: \"kubernetes.io/projected/f1911fd1-3aa9-4d9d-8934-6506db6e01d2-kube-api-access-b6w7r\") pod \"route-controller-manager-96f4d98db-q6zd9\" (UID: \"f1911fd1-3aa9-4d9d-8934-6506db6e01d2\") " pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.667839 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.725491 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:05:58 crc kubenswrapper[4703]: W1011 04:05:58.749756 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod034898e6_7c3d_4144_9f19_a084d99e169f.slice/crio-142a3c6c06b5f9e1296fa3694b876436be773521a8e9ecbb743bcc8fc191f9f8 WatchSource:0}: Error finding container 142a3c6c06b5f9e1296fa3694b876436be773521a8e9ecbb743bcc8fc191f9f8: Status 404 returned error can't find the container with id 142a3c6c06b5f9e1296fa3694b876436be773521a8e9ecbb743bcc8fc191f9f8 Oct 11 04:05:58 crc kubenswrapper[4703]: I1011 04:05:58.817835 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" event={"ID":"034898e6-7c3d-4144-9f19-a084d99e169f","Type":"ContainerStarted","Data":"142a3c6c06b5f9e1296fa3694b876436be773521a8e9ecbb743bcc8fc191f9f8"} Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.125365 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9"] Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.541280 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9425edee-98a4-4086-8d31-28478469501b" path="/var/lib/kubelet/pods/9425edee-98a4-4086-8d31-28478469501b/volumes" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.542380 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff55314a-7f2f-42c9-9723-2fdaad791959" path="/var/lib/kubelet/pods/ff55314a-7f2f-42c9-9723-2fdaad791959/volumes" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.827965 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" event={"ID":"f1911fd1-3aa9-4d9d-8934-6506db6e01d2","Type":"ContainerStarted","Data":"927c2906c41f86dc04a8b64d205a5fdfc36d1939cf4c89358aef40a86461aaff"} Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.828007 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" event={"ID":"f1911fd1-3aa9-4d9d-8934-6506db6e01d2","Type":"ContainerStarted","Data":"02dea053dc90bf83c6f29f746a7c837c1a8367eae762f931471f700df9cd2349"} Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.828674 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.831405 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" event={"ID":"034898e6-7c3d-4144-9f19-a084d99e169f","Type":"ContainerStarted","Data":"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd"} Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.831604 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" podUID="034898e6-7c3d-4144-9f19-a084d99e169f" containerName="controller-manager" containerID="cri-o://e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd" gracePeriod=30 Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.831793 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.836984 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 
04:05:59.847968 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" podStartSLOduration=1.847949139 podStartE2EDuration="1.847949139s" podCreationTimestamp="2025-10-11 04:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:05:59.84459232 +0000 UTC m=+651.055074252" watchObservedRunningTime="2025-10-11 04:05:59.847949139 +0000 UTC m=+651.058431061" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.852791 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-96f4d98db-q6zd9" Oct 11 04:05:59 crc kubenswrapper[4703]: I1011 04:05:59.882373 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" podStartSLOduration=3.8823564680000002 podStartE2EDuration="3.882356468s" podCreationTimestamp="2025-10-11 04:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:05:59.868360608 +0000 UTC m=+651.078842530" watchObservedRunningTime="2025-10-11 04:05:59.882356468 +0000 UTC m=+651.092838390" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.217907 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.218903 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca\") pod \"034898e6-7c3d-4144-9f19-a084d99e169f\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.218971 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert\") pod \"034898e6-7c3d-4144-9f19-a084d99e169f\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.219006 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles\") pod \"034898e6-7c3d-4144-9f19-a084d99e169f\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.219041 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkv7b\" (UniqueName: \"kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b\") pod \"034898e6-7c3d-4144-9f19-a084d99e169f\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.219645 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca" (OuterVolumeSpecName: "client-ca") pod "034898e6-7c3d-4144-9f19-a084d99e169f" (UID: "034898e6-7c3d-4144-9f19-a084d99e169f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.219707 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "034898e6-7c3d-4144-9f19-a084d99e169f" (UID: "034898e6-7c3d-4144-9f19-a084d99e169f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.225027 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b" (OuterVolumeSpecName: "kube-api-access-xkv7b") pod "034898e6-7c3d-4144-9f19-a084d99e169f" (UID: "034898e6-7c3d-4144-9f19-a084d99e169f"). InnerVolumeSpecName "kube-api-access-xkv7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.227289 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "034898e6-7c3d-4144-9f19-a084d99e169f" (UID: "034898e6-7c3d-4144-9f19-a084d99e169f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.252032 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78df479679-rvlnz"] Oct 11 04:06:00 crc kubenswrapper[4703]: E1011 04:06:00.252348 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034898e6-7c3d-4144-9f19-a084d99e169f" containerName="controller-manager" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.252370 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="034898e6-7c3d-4144-9f19-a084d99e169f" containerName="controller-manager" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.252502 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="034898e6-7c3d-4144-9f19-a084d99e169f" containerName="controller-manager" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.252971 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.261897 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78df479679-rvlnz"] Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319714 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config\") pod \"034898e6-7c3d-4144-9f19-a084d99e169f\" (UID: \"034898e6-7c3d-4144-9f19-a084d99e169f\") " Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319793 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/945b8030-e9bf-4915-8640-fc01e8cb7216-serving-cert\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " 
pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319825 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhhf\" (UniqueName: \"kubernetes.io/projected/945b8030-e9bf-4915-8640-fc01e8cb7216-kube-api-access-qhhhf\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319841 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-config\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319857 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-client-ca\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319890 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-proxy-ca-bundles\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319945 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319956 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034898e6-7c3d-4144-9f19-a084d99e169f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319963 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.319974 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkv7b\" (UniqueName: \"kubernetes.io/projected/034898e6-7c3d-4144-9f19-a084d99e169f-kube-api-access-xkv7b\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.320457 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config" (OuterVolumeSpecName: "config") pod "034898e6-7c3d-4144-9f19-a084d99e169f" (UID: "034898e6-7c3d-4144-9f19-a084d99e169f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.420865 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-proxy-ca-bundles\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.421004 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/945b8030-e9bf-4915-8640-fc01e8cb7216-serving-cert\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.421068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhhf\" (UniqueName: \"kubernetes.io/projected/945b8030-e9bf-4915-8640-fc01e8cb7216-kube-api-access-qhhhf\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.421105 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-config\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.421135 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-client-ca\") pod 
\"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.421211 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034898e6-7c3d-4144-9f19-a084d99e169f-config\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.422525 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-client-ca\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.422907 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-proxy-ca-bundles\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.423563 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945b8030-e9bf-4915-8640-fc01e8cb7216-config\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.428835 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/945b8030-e9bf-4915-8640-fc01e8cb7216-serving-cert\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " 
pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.439169 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhhf\" (UniqueName: \"kubernetes.io/projected/945b8030-e9bf-4915-8640-fc01e8cb7216-kube-api-access-qhhhf\") pod \"controller-manager-78df479679-rvlnz\" (UID: \"945b8030-e9bf-4915-8640-fc01e8cb7216\") " pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.584013 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.847241 4703 generic.go:334] "Generic (PLEG): container finished" podID="034898e6-7c3d-4144-9f19-a084d99e169f" containerID="e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd" exitCode=0 Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.847884 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.847940 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" event={"ID":"034898e6-7c3d-4144-9f19-a084d99e169f","Type":"ContainerDied","Data":"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd"} Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.848091 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt" event={"ID":"034898e6-7c3d-4144-9f19-a084d99e169f","Type":"ContainerDied","Data":"142a3c6c06b5f9e1296fa3694b876436be773521a8e9ecbb743bcc8fc191f9f8"} Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.848126 4703 scope.go:117] "RemoveContainer" containerID="e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.855808 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78df479679-rvlnz"] Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.868272 4703 scope.go:117] "RemoveContainer" containerID="e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd" Oct 11 04:06:00 crc kubenswrapper[4703]: E1011 04:06:00.868885 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd\": container with ID starting with e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd not found: ID does not exist" containerID="e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.868914 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd"} err="failed to get container status \"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd\": rpc error: code = NotFound desc = could not find container \"e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd\": container with ID starting with e5207ea93df9aac978be1fde04fb718c2d36a1b369eb7725c7ccbfde77c38bfd not found: ID does not exist" Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.880009 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:06:00 crc kubenswrapper[4703]: I1011 04:06:00.883559 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5685dcbdbc-4xgmt"] Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.542844 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="034898e6-7c3d-4144-9f19-a084d99e169f" path="/var/lib/kubelet/pods/034898e6-7c3d-4144-9f19-a084d99e169f/volumes" Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.855352 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" event={"ID":"945b8030-e9bf-4915-8640-fc01e8cb7216","Type":"ContainerStarted","Data":"ff28e8361ce18241944b69d236b87ff88def37313fecf6b800b91d1cd3126e4c"} Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.855511 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" event={"ID":"945b8030-e9bf-4915-8640-fc01e8cb7216","Type":"ContainerStarted","Data":"4ad22238d26aad906e231262b39aa28dad26468f923a11dbabd2df4321f0c0a7"} Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.855861 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:01 crc 
kubenswrapper[4703]: I1011 04:06:01.864308 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.877002 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78df479679-rvlnz" podStartSLOduration=3.876984919 podStartE2EDuration="3.876984919s" podCreationTimestamp="2025-10-11 04:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:06:01.874055542 +0000 UTC m=+653.084537474" watchObservedRunningTime="2025-10-11 04:06:01.876984919 +0000 UTC m=+653.087466841" Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.991165 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:06:01 crc kubenswrapper[4703]: I1011 04:06:01.991219 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:06:02 crc kubenswrapper[4703]: I1011 04:06:02.035272 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:06:02 crc kubenswrapper[4703]: I1011 04:06:02.898118 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-vmm7z" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.714888 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss"] Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.716500 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.718868 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.735698 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss"] Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.791594 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.791761 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.791919 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfds\" (UniqueName: \"kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 
04:06:04.892893 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfds\" (UniqueName: \"kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.892959 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.893003 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.893592 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.893598 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:04 crc kubenswrapper[4703]: I1011 04:06:04.924124 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnfds\" (UniqueName: \"kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds\") pod \"c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:05 crc kubenswrapper[4703]: I1011 04:06:05.038237 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:05 crc kubenswrapper[4703]: I1011 04:06:05.464142 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss"] Oct 11 04:06:05 crc kubenswrapper[4703]: W1011 04:06:05.475269 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc4bd20_b803_4781_92ce_d7ecf349398d.slice/crio-169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18 WatchSource:0}: Error finding container 169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18: Status 404 returned error can't find the container with id 169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18 Oct 11 04:06:05 crc kubenswrapper[4703]: I1011 04:06:05.883671 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerID="5b2b1ab98f005d8572b816fb2821dfd694a94137f9e22cfcdc3cd05b8e4411aa" exitCode=0 Oct 11 
04:06:05 crc kubenswrapper[4703]: I1011 04:06:05.883754 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" event={"ID":"4fc4bd20-b803-4781-92ce-d7ecf349398d","Type":"ContainerDied","Data":"5b2b1ab98f005d8572b816fb2821dfd694a94137f9e22cfcdc3cd05b8e4411aa"} Oct 11 04:06:05 crc kubenswrapper[4703]: I1011 04:06:05.884304 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" event={"ID":"4fc4bd20-b803-4781-92ce-d7ecf349398d","Type":"ContainerStarted","Data":"169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18"} Oct 11 04:06:07 crc kubenswrapper[4703]: I1011 04:06:07.117481 4703 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 04:06:07 crc kubenswrapper[4703]: I1011 04:06:07.898531 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerID="7fff3793cb08d6e25598f8551eee99217b2d912f78def035e18799805e42e06e" exitCode=0 Oct 11 04:06:07 crc kubenswrapper[4703]: I1011 04:06:07.898659 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" event={"ID":"4fc4bd20-b803-4781-92ce-d7ecf349398d","Type":"ContainerDied","Data":"7fff3793cb08d6e25598f8551eee99217b2d912f78def035e18799805e42e06e"} Oct 11 04:06:08 crc kubenswrapper[4703]: I1011 04:06:08.909236 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerID="eb1a475a5fcef16d82f33efed7daf4979f1ad2d39d7d3fdbf57249d5f539dbf3" exitCode=0 Oct 11 04:06:08 crc kubenswrapper[4703]: I1011 04:06:08.909301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" 
event={"ID":"4fc4bd20-b803-4781-92ce-d7ecf349398d","Type":"ContainerDied","Data":"eb1a475a5fcef16d82f33efed7daf4979f1ad2d39d7d3fdbf57249d5f539dbf3"} Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.324130 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.469705 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle\") pod \"4fc4bd20-b803-4781-92ce-d7ecf349398d\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.469796 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util\") pod \"4fc4bd20-b803-4781-92ce-d7ecf349398d\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.469857 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnfds\" (UniqueName: \"kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds\") pod \"4fc4bd20-b803-4781-92ce-d7ecf349398d\" (UID: \"4fc4bd20-b803-4781-92ce-d7ecf349398d\") " Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.471399 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle" (OuterVolumeSpecName: "bundle") pod "4fc4bd20-b803-4781-92ce-d7ecf349398d" (UID: "4fc4bd20-b803-4781-92ce-d7ecf349398d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.479600 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds" (OuterVolumeSpecName: "kube-api-access-cnfds") pod "4fc4bd20-b803-4781-92ce-d7ecf349398d" (UID: "4fc4bd20-b803-4781-92ce-d7ecf349398d"). InnerVolumeSpecName "kube-api-access-cnfds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.508712 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util" (OuterVolumeSpecName: "util") pod "4fc4bd20-b803-4781-92ce-d7ecf349398d" (UID: "4fc4bd20-b803-4781-92ce-d7ecf349398d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.571741 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.571792 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc4bd20-b803-4781-92ce-d7ecf349398d-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.571815 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnfds\" (UniqueName: \"kubernetes.io/projected/4fc4bd20-b803-4781-92ce-d7ecf349398d-kube-api-access-cnfds\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.928442 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" 
event={"ID":"4fc4bd20-b803-4781-92ce-d7ecf349398d","Type":"ContainerDied","Data":"169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18"} Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.928534 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169d18c3e9b3a240f151e2a53d7771740440cf4afca8d8184ebe131fa968ad18" Oct 11 04:06:10 crc kubenswrapper[4703]: I1011 04:06:10.928630 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.755110 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-65968694dd-z4smp"] Oct 11 04:06:18 crc kubenswrapper[4703]: E1011 04:06:18.755864 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="util" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.755878 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="util" Oct 11 04:06:18 crc kubenswrapper[4703]: E1011 04:06:18.755894 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="pull" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.755901 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="pull" Oct 11 04:06:18 crc kubenswrapper[4703]: E1011 04:06:18.755918 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="extract" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.755926 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="extract" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.756046 4703 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc4bd20-b803-4781-92ce-d7ecf349398d" containerName="extract" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.756746 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.763486 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.763524 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2frw7" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.799121 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65968694dd-z4smp"] Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.878686 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-webhook-cert\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.878741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-apiservice-cert\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.878796 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42kcm\" (UniqueName: \"kubernetes.io/projected/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-kube-api-access-42kcm\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.979946 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-webhook-cert\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.980022 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-apiservice-cert\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.980103 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kcm\" (UniqueName: \"kubernetes.io/projected/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-kube-api-access-42kcm\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.985698 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-apiservice-cert\") pod 
\"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:18 crc kubenswrapper[4703]: I1011 04:06:18.995760 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42kcm\" (UniqueName: \"kubernetes.io/projected/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-kube-api-access-42kcm\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:19 crc kubenswrapper[4703]: I1011 04:06:18.999981 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9faffc6-5f30-4e1b-94e3-49ffb44ca354-webhook-cert\") pod \"infra-operator-controller-manager-65968694dd-z4smp\" (UID: \"d9faffc6-5f30-4e1b-94e3-49ffb44ca354\") " pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:19 crc kubenswrapper[4703]: I1011 04:06:19.075729 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:19 crc kubenswrapper[4703]: I1011 04:06:19.507959 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65968694dd-z4smp"] Oct 11 04:06:19 crc kubenswrapper[4703]: W1011 04:06:19.515048 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9faffc6_5f30_4e1b_94e3_49ffb44ca354.slice/crio-e6f406bd0c5045e7dd627ce7f04bc5a4bb0cd4be139df37860f119d7ab520130 WatchSource:0}: Error finding container e6f406bd0c5045e7dd627ce7f04bc5a4bb0cd4be139df37860f119d7ab520130: Status 404 returned error can't find the container with id e6f406bd0c5045e7dd627ce7f04bc5a4bb0cd4be139df37860f119d7ab520130 Oct 11 04:06:19 crc kubenswrapper[4703]: I1011 04:06:19.989758 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" event={"ID":"d9faffc6-5f30-4e1b-94e3-49ffb44ca354","Type":"ContainerStarted","Data":"e6f406bd0c5045e7dd627ce7f04bc5a4bb0cd4be139df37860f119d7ab520130"} Oct 11 04:06:22 crc kubenswrapper[4703]: I1011 04:06:22.007702 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" event={"ID":"d9faffc6-5f30-4e1b-94e3-49ffb44ca354","Type":"ContainerStarted","Data":"58fac2035f0da4a1181f99b8defae0f28e66222af39cc20b66c0280406aec8d2"} Oct 11 04:06:22 crc kubenswrapper[4703]: I1011 04:06:22.008407 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" event={"ID":"d9faffc6-5f30-4e1b-94e3-49ffb44ca354","Type":"ContainerStarted","Data":"d920333bbf851d7baa37c0035f23de15771e1c3f45fe508dbfc21b50b6a93c7a"} Oct 11 04:06:22 crc kubenswrapper[4703]: I1011 04:06:22.008445 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:22 crc kubenswrapper[4703]: I1011 04:06:22.044916 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" podStartSLOduration=2.320060288 podStartE2EDuration="4.04489094s" podCreationTimestamp="2025-10-11 04:06:18 +0000 UTC" firstStartedPulling="2025-10-11 04:06:19.517993993 +0000 UTC m=+670.728475915" lastFinishedPulling="2025-10-11 04:06:21.242824635 +0000 UTC m=+672.453306567" observedRunningTime="2025-10-11 04:06:22.042685602 +0000 UTC m=+673.253167584" watchObservedRunningTime="2025-10-11 04:06:22.04489094 +0000 UTC m=+673.255372902" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.594266 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.595381 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.598680 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.598790 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.598875 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.599425 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.599710 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-rtrzw" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.600002 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.604509 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.605393 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.607904 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.609028 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.615027 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.634419 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.642180 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739297 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-secrets\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739345 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-kolla-config\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739371 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739387 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/64938774-1b02-463f-96e3-451096b692d6-config-data-generated\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739417 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kolla-config\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739435 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/54089f16-2b27-4774-bacd-faf623efc8a0-secrets\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739451 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739548 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-kolla-config\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739568 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554fw\" (UniqueName: 
\"kubernetes.io/projected/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kube-api-access-554fw\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739590 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/64938774-1b02-463f-96e3-451096b692d6-secrets\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739605 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-operator-scripts\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739619 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739641 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739662 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-default\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739679 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-config-data-default\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739695 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkhc\" (UniqueName: \"kubernetes.io/projected/64938774-1b02-463f-96e3-451096b692d6-kube-api-access-vzkhc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739714 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739758 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739783 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.739801 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjls\" (UniqueName: \"kubernetes.io/projected/54089f16-2b27-4774-bacd-faf623efc8a0-kube-api-access-9hjls\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841385 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-default\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841440 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-config-data-default\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841497 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkhc\" 
(UniqueName: \"kubernetes.io/projected/64938774-1b02-463f-96e3-451096b692d6-kube-api-access-vzkhc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841535 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841573 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841605 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841647 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841674 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjls\" (UniqueName: 
\"kubernetes.io/projected/54089f16-2b27-4774-bacd-faf623efc8a0-kube-api-access-9hjls\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841707 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-secrets\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841745 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-kolla-config\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841785 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841816 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64938774-1b02-463f-96e3-451096b692d6-config-data-generated\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841874 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kolla-config\") pod \"openstack-galera-1\" 
(UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841879 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841908 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841937 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-kolla-config\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841962 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/54089f16-2b27-4774-bacd-faf623efc8a0-secrets\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.841992 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-554fw\" (UniqueName: \"kubernetes.io/projected/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kube-api-access-554fw\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc 
kubenswrapper[4703]: I1011 04:06:23.842027 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/64938774-1b02-463f-96e3-451096b692d6-secrets\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842059 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-operator-scripts\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842090 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842135 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842557 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64938774-1b02-463f-96e3-451096b692d6-config-data-generated\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842583 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-default\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842670 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-config-data-default\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842885 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-kolla-config\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843002 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843259 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kolla-config\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843343 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843368 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843410 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-operator-scripts\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843538 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.842092 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54089f16-2b27-4774-bacd-faf623efc8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.843748 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-config-data-generated\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " 
pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.844816 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54089f16-2b27-4774-bacd-faf623efc8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.844937 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64938774-1b02-463f-96e3-451096b692d6-operator-scripts\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.848833 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/64938774-1b02-463f-96e3-451096b692d6-secrets\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.848881 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/54089f16-2b27-4774-bacd-faf623efc8a0-secrets\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.851220 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-secrets\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.860235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vzkhc\" (UniqueName: \"kubernetes.io/projected/64938774-1b02-463f-96e3-451096b692d6-kube-api-access-vzkhc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.862865 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554fw\" (UniqueName: \"kubernetes.io/projected/6973f9a3-172f-4f91-8f2c-4e14b7ee07c2-kube-api-access-554fw\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.863582 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.864188 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"64938774-1b02-463f-96e3-451096b692d6\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.867002 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.875251 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjls\" (UniqueName: \"kubernetes.io/projected/54089f16-2b27-4774-bacd-faf623efc8a0-kube-api-access-9hjls\") pod \"openstack-galera-0\" (UID: 
\"54089f16-2b27-4774-bacd-faf623efc8a0\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.921333 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.932158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:23 crc kubenswrapper[4703]: I1011 04:06:23.942027 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:24 crc kubenswrapper[4703]: I1011 04:06:24.446965 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Oct 11 04:06:24 crc kubenswrapper[4703]: W1011 04:06:24.458890 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54089f16_2b27_4774_bacd_faf623efc8a0.slice/crio-b74c9df55afda92a04de9e486f1f9fba19763f3557a9f415aea84b3d73106f44 WatchSource:0}: Error finding container b74c9df55afda92a04de9e486f1f9fba19763f3557a9f415aea84b3d73106f44: Status 404 returned error can't find the container with id b74c9df55afda92a04de9e486f1f9fba19763f3557a9f415aea84b3d73106f44 Oct 11 04:06:24 crc kubenswrapper[4703]: I1011 04:06:24.527248 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Oct 11 04:06:24 crc kubenswrapper[4703]: W1011 04:06:24.534522 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64938774_1b02_463f_96e3_451096b692d6.slice/crio-6aeeccc625ee4d7db94cb600bf04e6dff07f66830773eb028e0d5ae8ea118948 WatchSource:0}: Error finding container 6aeeccc625ee4d7db94cb600bf04e6dff07f66830773eb028e0d5ae8ea118948: Status 404 returned error can't find the container with id 
6aeeccc625ee4d7db94cb600bf04e6dff07f66830773eb028e0d5ae8ea118948 Oct 11 04:06:24 crc kubenswrapper[4703]: I1011 04:06:24.669479 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Oct 11 04:06:24 crc kubenswrapper[4703]: W1011 04:06:24.678392 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6973f9a3_172f_4f91_8f2c_4e14b7ee07c2.slice/crio-4edb55581b2c228b1b4f9ca669bdf56ec17e6798ac8de51c36080d9653b72fa0 WatchSource:0}: Error finding container 4edb55581b2c228b1b4f9ca669bdf56ec17e6798ac8de51c36080d9653b72fa0: Status 404 returned error can't find the container with id 4edb55581b2c228b1b4f9ca669bdf56ec17e6798ac8de51c36080d9653b72fa0 Oct 11 04:06:25 crc kubenswrapper[4703]: I1011 04:06:25.030229 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"64938774-1b02-463f-96e3-451096b692d6","Type":"ContainerStarted","Data":"6aeeccc625ee4d7db94cb600bf04e6dff07f66830773eb028e0d5ae8ea118948"} Oct 11 04:06:25 crc kubenswrapper[4703]: I1011 04:06:25.031166 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"54089f16-2b27-4774-bacd-faf623efc8a0","Type":"ContainerStarted","Data":"b74c9df55afda92a04de9e486f1f9fba19763f3557a9f415aea84b3d73106f44"} Oct 11 04:06:25 crc kubenswrapper[4703]: I1011 04:06:25.032867 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2","Type":"ContainerStarted","Data":"4edb55581b2c228b1b4f9ca669bdf56ec17e6798ac8de51c36080d9653b72fa0"} Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.081629 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-65968694dd-z4smp" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.804902 4703 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.805957 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.808062 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-bztnl" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.808096 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.820301 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.935576 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-kolla-config\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.935638 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-config-data\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:29 crc kubenswrapper[4703]: I1011 04:06:29.935667 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx6w\" (UniqueName: \"kubernetes.io/projected/61bf3124-e977-402f-908c-85675bcd26ed-kube-api-access-9fx6w\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.036879 
4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx6w\" (UniqueName: \"kubernetes.io/projected/61bf3124-e977-402f-908c-85675bcd26ed-kube-api-access-9fx6w\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.036961 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-kolla-config\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.037005 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-config-data\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.037671 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-config-data\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.038352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61bf3124-e977-402f-908c-85675bcd26ed-kolla-config\") pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.066352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx6w\" (UniqueName: \"kubernetes.io/projected/61bf3124-e977-402f-908c-85675bcd26ed-kube-api-access-9fx6w\") 
pod \"memcached-0\" (UID: \"61bf3124-e977-402f-908c-85675bcd26ed\") " pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:30 crc kubenswrapper[4703]: I1011 04:06:30.133481 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.792021 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.866069 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-nmdsx"] Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.867122 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.875336 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-x7vxz" Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.883890 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-nmdsx"] Oct 11 04:06:32 crc kubenswrapper[4703]: I1011 04:06:32.932861 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqmv\" (UniqueName: \"kubernetes.io/projected/388c9894-d469-4e1f-a0b1-fe16001c9620-kube-api-access-cxqmv\") pod \"rabbitmq-cluster-operator-index-nmdsx\" (UID: \"388c9894-d469-4e1f-a0b1-fe16001c9620\") " pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.034033 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqmv\" (UniqueName: \"kubernetes.io/projected/388c9894-d469-4e1f-a0b1-fe16001c9620-kube-api-access-cxqmv\") pod \"rabbitmq-cluster-operator-index-nmdsx\" (UID: 
\"388c9894-d469-4e1f-a0b1-fe16001c9620\") " pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.054226 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqmv\" (UniqueName: \"kubernetes.io/projected/388c9894-d469-4e1f-a0b1-fe16001c9620-kube-api-access-cxqmv\") pod \"rabbitmq-cluster-operator-index-nmdsx\" (UID: \"388c9894-d469-4e1f-a0b1-fe16001c9620\") " pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.092487 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"54089f16-2b27-4774-bacd-faf623efc8a0","Type":"ContainerStarted","Data":"b50acde8cb4267ebc77f69a9a58ffa7d9e9a5f473e1062798017a4efeb0f58d2"} Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.094143 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2","Type":"ContainerStarted","Data":"5bb773904e7fc48391051bb71da4b756bcc0338419604aea3eb9dd8064100109"} Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.095074 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"61bf3124-e977-402f-908c-85675bcd26ed","Type":"ContainerStarted","Data":"6472de888f51669d0ef043f3b9cc1abb010c4029579675a16fb83c42ac59da97"} Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.096560 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"64938774-1b02-463f-96e3-451096b692d6","Type":"ContainerStarted","Data":"5a4ecb670f99497653f31292a90d3d26b8f09dd6966637a4aac59f7c0d8d60aa"} Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.181683 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:33 crc kubenswrapper[4703]: I1011 04:06:33.767834 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-nmdsx"] Oct 11 04:06:33 crc kubenswrapper[4703]: W1011 04:06:33.778808 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388c9894_d469_4e1f_a0b1_fe16001c9620.slice/crio-54a327f8331d842c866c542e88c3f342e0eb0528289114ca58016fd626121b08 WatchSource:0}: Error finding container 54a327f8331d842c866c542e88c3f342e0eb0528289114ca58016fd626121b08: Status 404 returned error can't find the container with id 54a327f8331d842c866c542e88c3f342e0eb0528289114ca58016fd626121b08 Oct 11 04:06:34 crc kubenswrapper[4703]: I1011 04:06:34.102933 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" event={"ID":"388c9894-d469-4e1f-a0b1-fe16001c9620","Type":"ContainerStarted","Data":"54a327f8331d842c866c542e88c3f342e0eb0528289114ca58016fd626121b08"} Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.117897 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"61bf3124-e977-402f-908c-85675bcd26ed","Type":"ContainerStarted","Data":"4dcf7150603e895b75cb586c8b9dec50783325cf22ef378fcbc7c2815ade6e70"} Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.118158 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.670621 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=5.233337205 podStartE2EDuration="7.670602063s" podCreationTimestamp="2025-10-11 04:06:29 +0000 UTC" firstStartedPulling="2025-10-11 04:06:32.802615627 +0000 UTC m=+684.013097549" 
lastFinishedPulling="2025-10-11 04:06:35.239880475 +0000 UTC m=+686.450362407" observedRunningTime="2025-10-11 04:06:36.14070136 +0000 UTC m=+687.351183292" watchObservedRunningTime="2025-10-11 04:06:36.670602063 +0000 UTC m=+687.881083985" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.676745 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.677919 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.690840 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.794598 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.794805 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.795096 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqnd\" (UniqueName: \"kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " 
pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.896924 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqnd\" (UniqueName: \"kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.897043 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.897076 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.897513 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.897603 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " 
pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:36 crc kubenswrapper[4703]: I1011 04:06:36.933346 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqnd\" (UniqueName: \"kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd\") pod \"certified-operators-2862j\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.103256 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.130923 4703 generic.go:334] "Generic (PLEG): container finished" podID="64938774-1b02-463f-96e3-451096b692d6" containerID="5a4ecb670f99497653f31292a90d3d26b8f09dd6966637a4aac59f7c0d8d60aa" exitCode=0 Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.130999 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"64938774-1b02-463f-96e3-451096b692d6","Type":"ContainerDied","Data":"5a4ecb670f99497653f31292a90d3d26b8f09dd6966637a4aac59f7c0d8d60aa"} Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.133506 4703 generic.go:334] "Generic (PLEG): container finished" podID="54089f16-2b27-4774-bacd-faf623efc8a0" containerID="b50acde8cb4267ebc77f69a9a58ffa7d9e9a5f473e1062798017a4efeb0f58d2" exitCode=0 Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.133592 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"54089f16-2b27-4774-bacd-faf623efc8a0","Type":"ContainerDied","Data":"b50acde8cb4267ebc77f69a9a58ffa7d9e9a5f473e1062798017a4efeb0f58d2"} Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.137207 4703 generic.go:334] "Generic (PLEG): container finished" podID="6973f9a3-172f-4f91-8f2c-4e14b7ee07c2" 
containerID="5bb773904e7fc48391051bb71da4b756bcc0338419604aea3eb9dd8064100109" exitCode=0 Oct 11 04:06:37 crc kubenswrapper[4703]: I1011 04:06:37.137253 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2","Type":"ContainerDied","Data":"5bb773904e7fc48391051bb71da4b756bcc0338419604aea3eb9dd8064100109"} Oct 11 04:06:40 crc kubenswrapper[4703]: I1011 04:06:40.135500 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.072254 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.162624 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"54089f16-2b27-4774-bacd-faf623efc8a0","Type":"ContainerStarted","Data":"0a176f8f10dc6f19f294f42298fab9a66bb4e53707c2ed01a71588bf3c7d73dd"} Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.164301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" event={"ID":"388c9894-d469-4e1f-a0b1-fe16001c9620","Type":"ContainerStarted","Data":"84d2424e3a798ae49a66851a889205a23ef1ab13c89ba7916eda9170319c1446"} Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.166623 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"6973f9a3-172f-4f91-8f2c-4e14b7ee07c2","Type":"ContainerStarted","Data":"265ebe5d288154c85ac4205175d02c09970a9173f9293109122224a402651387"} Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.168136 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" 
event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerStarted","Data":"0aa70ca8b20be2b1f5a2094cf4a33975810fcfc35eda074781f7f60fc21a1441"} Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.169959 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"64938774-1b02-463f-96e3-451096b692d6","Type":"ContainerStarted","Data":"93a877158e489c3333ebdef42d6991774119c651fcd756ea69664c1527c981de"} Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.199062 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=11.304709035 podStartE2EDuration="19.199042823s" podCreationTimestamp="2025-10-11 04:06:22 +0000 UTC" firstStartedPulling="2025-10-11 04:06:24.460203588 +0000 UTC m=+675.670685550" lastFinishedPulling="2025-10-11 04:06:32.354537405 +0000 UTC m=+683.565019338" observedRunningTime="2025-10-11 04:06:41.19516017 +0000 UTC m=+692.405642112" watchObservedRunningTime="2025-10-11 04:06:41.199042823 +0000 UTC m=+692.409524745" Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.219586 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=11.54554488 podStartE2EDuration="19.219567626s" podCreationTimestamp="2025-10-11 04:06:22 +0000 UTC" firstStartedPulling="2025-10-11 04:06:24.680607532 +0000 UTC m=+675.891089444" lastFinishedPulling="2025-10-11 04:06:32.354630268 +0000 UTC m=+683.565112190" observedRunningTime="2025-10-11 04:06:41.217370717 +0000 UTC m=+692.427852669" watchObservedRunningTime="2025-10-11 04:06:41.219567626 +0000 UTC m=+692.430049548" Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.246531 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=11.354356308 podStartE2EDuration="19.246517778s" podCreationTimestamp="2025-10-11 
04:06:22 +0000 UTC" firstStartedPulling="2025-10-11 04:06:24.536640258 +0000 UTC m=+675.747122180" lastFinishedPulling="2025-10-11 04:06:32.428801728 +0000 UTC m=+683.639283650" observedRunningTime="2025-10-11 04:06:41.245925922 +0000 UTC m=+692.456407854" watchObservedRunningTime="2025-10-11 04:06:41.246517778 +0000 UTC m=+692.456999700" Oct 11 04:06:41 crc kubenswrapper[4703]: I1011 04:06:41.267150 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" podStartSLOduration=2.343668402 podStartE2EDuration="9.267124143s" podCreationTimestamp="2025-10-11 04:06:32 +0000 UTC" firstStartedPulling="2025-10-11 04:06:33.780968351 +0000 UTC m=+684.991450273" lastFinishedPulling="2025-10-11 04:06:40.704424082 +0000 UTC m=+691.914906014" observedRunningTime="2025-10-11 04:06:41.266541647 +0000 UTC m=+692.477023579" watchObservedRunningTime="2025-10-11 04:06:41.267124143 +0000 UTC m=+692.477606095" Oct 11 04:06:42 crc kubenswrapper[4703]: I1011 04:06:42.178718 4703 generic.go:334] "Generic (PLEG): container finished" podID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerID="1564a70a35a2e476aca34bacd1701af24406029e671014d57ed894663ea18d79" exitCode=0 Oct 11 04:06:42 crc kubenswrapper[4703]: I1011 04:06:42.178846 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerDied","Data":"1564a70a35a2e476aca34bacd1701af24406029e671014d57ed894663ea18d79"} Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.181878 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.181943 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:43 crc kubenswrapper[4703]: 
I1011 04:06:43.224333 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.921567 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.921635 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.933348 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.933432 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.942549 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:43 crc kubenswrapper[4703]: I1011 04:06:43.942601 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:06:44 crc kubenswrapper[4703]: I1011 04:06:44.200110 4703 generic.go:334] "Generic (PLEG): container finished" podID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerID="c40eb63a31943efdbba26f98d58e76910fb3df247ac34cba780b7c669b4a0487" exitCode=0 Oct 11 04:06:44 crc kubenswrapper[4703]: I1011 04:06:44.202151 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerDied","Data":"c40eb63a31943efdbba26f98d58e76910fb3df247ac34cba780b7c669b4a0487"} Oct 11 04:06:45 crc kubenswrapper[4703]: I1011 04:06:45.210448 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" 
event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerStarted","Data":"eff502a6b3336c9d6ef18eef39c0c701f051e55497886a4339e9a36ea186a23f"} Oct 11 04:06:45 crc kubenswrapper[4703]: I1011 04:06:45.244023 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2862j" podStartSLOduration=6.718281611 podStartE2EDuration="9.243999316s" podCreationTimestamp="2025-10-11 04:06:36 +0000 UTC" firstStartedPulling="2025-10-11 04:06:42.181129256 +0000 UTC m=+693.391611208" lastFinishedPulling="2025-10-11 04:06:44.706846971 +0000 UTC m=+695.917328913" observedRunningTime="2025-10-11 04:06:45.238910181 +0000 UTC m=+696.449392113" watchObservedRunningTime="2025-10-11 04:06:45.243999316 +0000 UTC m=+696.454481278" Oct 11 04:06:45 crc kubenswrapper[4703]: I1011 04:06:45.252510 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-nmdsx" Oct 11 04:06:47 crc kubenswrapper[4703]: I1011 04:06:47.104621 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:47 crc kubenswrapper[4703]: I1011 04:06:47.104977 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:47 crc kubenswrapper[4703]: I1011 04:06:47.164254 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:50 crc kubenswrapper[4703]: I1011 04:06:50.048653 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:50 crc kubenswrapper[4703]: I1011 04:06:50.117112 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Oct 11 04:06:50 crc kubenswrapper[4703]: E1011 04:06:50.391144 4703 
upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:41698->38.102.83.162:38355: write tcp 38.102.83.162:41698->38.102.83.162:38355: write: broken pipe Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.122177 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n"] Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.125310 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.128074 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.130719 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n"] Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.239719 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgvx\" (UniqueName: \"kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.239803 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 
crc kubenswrapper[4703]: I1011 04:06:53.239832 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.340906 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.340969 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.341048 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgvx\" (UniqueName: \"kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.341595 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.341608 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.361869 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgvx\" (UniqueName: \"kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.446487 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:53 crc kubenswrapper[4703]: I1011 04:06:53.953371 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n"] Oct 11 04:06:53 crc kubenswrapper[4703]: W1011 04:06:53.959644 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43dd0973_48f7_42ac_b4a9_fa2373381562.slice/crio-4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090 WatchSource:0}: Error finding container 4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090: Status 404 returned error can't find the container with id 4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090 Oct 11 04:06:54 crc kubenswrapper[4703]: I1011 04:06:54.278231 4703 generic.go:334] "Generic (PLEG): container finished" podID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerID="e5cc89514f45e4e0472a4e486d4abe5cdad105a31b442c25efe4aa823ad64e19" exitCode=0 Oct 11 04:06:54 crc kubenswrapper[4703]: I1011 04:06:54.278352 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" event={"ID":"43dd0973-48f7-42ac-b4a9-fa2373381562","Type":"ContainerDied","Data":"e5cc89514f45e4e0472a4e486d4abe5cdad105a31b442c25efe4aa823ad64e19"} Oct 11 04:06:54 crc kubenswrapper[4703]: I1011 04:06:54.278620 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" event={"ID":"43dd0973-48f7-42ac-b4a9-fa2373381562","Type":"ContainerStarted","Data":"4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090"} Oct 11 04:06:55 crc kubenswrapper[4703]: I1011 04:06:55.285716 4703 generic.go:334] "Generic (PLEG): container finished" 
podID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerID="58c609cb05b594e93c48d34ae50d448a1a5d0e0f0cfcb650ef45817d52e9b737" exitCode=0 Oct 11 04:06:55 crc kubenswrapper[4703]: I1011 04:06:55.285920 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" event={"ID":"43dd0973-48f7-42ac-b4a9-fa2373381562","Type":"ContainerDied","Data":"58c609cb05b594e93c48d34ae50d448a1a5d0e0f0cfcb650ef45817d52e9b737"} Oct 11 04:06:56 crc kubenswrapper[4703]: I1011 04:06:56.293005 4703 generic.go:334] "Generic (PLEG): container finished" podID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerID="8b5a525c5a92bfed86120c4e21f13ec82aeaf48eb25e51dc048a4dde5b1288b3" exitCode=0 Oct 11 04:06:56 crc kubenswrapper[4703]: I1011 04:06:56.293047 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" event={"ID":"43dd0973-48f7-42ac-b4a9-fa2373381562","Type":"ContainerDied","Data":"8b5a525c5a92bfed86120c4e21f13ec82aeaf48eb25e51dc048a4dde5b1288b3"} Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.164206 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.660874 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.703409 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbgvx\" (UniqueName: \"kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx\") pod \"43dd0973-48f7-42ac-b4a9-fa2373381562\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.703455 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util\") pod \"43dd0973-48f7-42ac-b4a9-fa2373381562\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.703534 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle\") pod \"43dd0973-48f7-42ac-b4a9-fa2373381562\" (UID: \"43dd0973-48f7-42ac-b4a9-fa2373381562\") " Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.704187 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle" (OuterVolumeSpecName: "bundle") pod "43dd0973-48f7-42ac-b4a9-fa2373381562" (UID: "43dd0973-48f7-42ac-b4a9-fa2373381562"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.709331 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx" (OuterVolumeSpecName: "kube-api-access-wbgvx") pod "43dd0973-48f7-42ac-b4a9-fa2373381562" (UID: "43dd0973-48f7-42ac-b4a9-fa2373381562"). InnerVolumeSpecName "kube-api-access-wbgvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.712870 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util" (OuterVolumeSpecName: "util") pod "43dd0973-48f7-42ac-b4a9-fa2373381562" (UID: "43dd0973-48f7-42ac-b4a9-fa2373381562"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.805631 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.805681 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbgvx\" (UniqueName: \"kubernetes.io/projected/43dd0973-48f7-42ac-b4a9-fa2373381562-kube-api-access-wbgvx\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:57 crc kubenswrapper[4703]: I1011 04:06:57.805695 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43dd0973-48f7-42ac-b4a9-fa2373381562-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:06:58 crc kubenswrapper[4703]: I1011 04:06:58.306644 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" event={"ID":"43dd0973-48f7-42ac-b4a9-fa2373381562","Type":"ContainerDied","Data":"4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090"} Oct 11 04:06:58 crc kubenswrapper[4703]: I1011 04:06:58.306696 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a5a7db6ddcad80da9ad974810b55f9283765251ea4d5be47d45844ea5e1c090" Oct 11 04:06:58 crc kubenswrapper[4703]: I1011 04:06:58.306744 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n" Oct 11 04:07:00 crc kubenswrapper[4703]: I1011 04:07:00.460614 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:07:00 crc kubenswrapper[4703]: I1011 04:07:00.461016 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2862j" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="registry-server" containerID="cri-o://eff502a6b3336c9d6ef18eef39c0c701f051e55497886a4339e9a36ea186a23f" gracePeriod=2 Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.332768 4703 generic.go:334] "Generic (PLEG): container finished" podID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerID="eff502a6b3336c9d6ef18eef39c0c701f051e55497886a4339e9a36ea186a23f" exitCode=0 Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.333074 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerDied","Data":"eff502a6b3336c9d6ef18eef39c0c701f051e55497886a4339e9a36ea186a23f"} Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.408369 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.458375 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content\") pod \"2dbc731e-8bd9-4600-9b67-c88dddffe755\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.458432 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities\") pod \"2dbc731e-8bd9-4600-9b67-c88dddffe755\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.458627 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vqnd\" (UniqueName: \"kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd\") pod \"2dbc731e-8bd9-4600-9b67-c88dddffe755\" (UID: \"2dbc731e-8bd9-4600-9b67-c88dddffe755\") " Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.459307 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities" (OuterVolumeSpecName: "utilities") pod "2dbc731e-8bd9-4600-9b67-c88dddffe755" (UID: "2dbc731e-8bd9-4600-9b67-c88dddffe755"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.464700 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd" (OuterVolumeSpecName: "kube-api-access-2vqnd") pod "2dbc731e-8bd9-4600-9b67-c88dddffe755" (UID: "2dbc731e-8bd9-4600-9b67-c88dddffe755"). InnerVolumeSpecName "kube-api-access-2vqnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.509606 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dbc731e-8bd9-4600-9b67-c88dddffe755" (UID: "2dbc731e-8bd9-4600-9b67-c88dddffe755"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.560191 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.560226 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dbc731e-8bd9-4600-9b67-c88dddffe755-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:01 crc kubenswrapper[4703]: I1011 04:07:01.560238 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vqnd\" (UniqueName: \"kubernetes.io/projected/2dbc731e-8bd9-4600-9b67-c88dddffe755-kube-api-access-2vqnd\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.341542 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2862j" event={"ID":"2dbc731e-8bd9-4600-9b67-c88dddffe755","Type":"ContainerDied","Data":"0aa70ca8b20be2b1f5a2094cf4a33975810fcfc35eda074781f7f60fc21a1441"} Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.341589 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2862j" Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.341596 4703 scope.go:117] "RemoveContainer" containerID="eff502a6b3336c9d6ef18eef39c0c701f051e55497886a4339e9a36ea186a23f" Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.364513 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.366291 4703 scope.go:117] "RemoveContainer" containerID="c40eb63a31943efdbba26f98d58e76910fb3df247ac34cba780b7c669b4a0487" Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.370457 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2862j"] Oct 11 04:07:02 crc kubenswrapper[4703]: I1011 04:07:02.386021 4703 scope.go:117] "RemoveContainer" containerID="1564a70a35a2e476aca34bacd1701af24406029e671014d57ed894663ea18d79" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.019653 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.065995 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.541626 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" path="/var/lib/kubelet/pods/2dbc731e-8bd9-4600-9b67-c88dddffe755/volumes" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669065 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669296 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="extract-content" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 
04:07:03.669307 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="extract-content" Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669317 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="pull" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669324 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="pull" Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669333 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="registry-server" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669339 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="registry-server" Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669349 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="extract-utilities" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669355 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="extract-utilities" Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669361 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="util" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669367 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="util" Oct 11 04:07:03 crc kubenswrapper[4703]: E1011 04:07:03.669376 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="extract" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669381 4703 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="extract" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669489 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="43dd0973-48f7-42ac-b4a9-fa2373381562" containerName="extract" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.669501 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbc731e-8bd9-4600-9b67-c88dddffe755" containerName="registry-server" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.670234 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.688069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.688124 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.688171 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjf4j\" (UniqueName: \"kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.700100 4703 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.789117 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.789512 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.789581 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjf4j\" (UniqueName: \"kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.789669 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.789887 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content\") pod \"community-operators-wg2kn\" (UID: 
\"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.811648 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjf4j\" (UniqueName: \"kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j\") pod \"community-operators-wg2kn\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:03 crc kubenswrapper[4703]: I1011 04:07:03.986045 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.404432 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:04 crc kubenswrapper[4703]: W1011 04:07:04.410060 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc578ce60_c402_45aa_96cd_98ca13d25a89.slice/crio-f154849aabbfa160a84cd9ba39625ceeb185319293ce0740bca0169c1e4af4aa WatchSource:0}: Error finding container f154849aabbfa160a84cd9ba39625ceeb185319293ce0740bca0169c1e4af4aa: Status 404 returned error can't find the container with id f154849aabbfa160a84cd9ba39625ceeb185319293ce0740bca0169c1e4af4aa Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.574203 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.610728 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.927727 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j"] Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 
04:07:04.928530 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.931027 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-s9jtr" Oct 11 04:07:04 crc kubenswrapper[4703]: I1011 04:07:04.931367 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j"] Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.006723 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98kvn\" (UniqueName: \"kubernetes.io/projected/97ee2afe-4504-4602-8708-09f8ccae07dc-kube-api-access-98kvn\") pod \"rabbitmq-cluster-operator-779fc9694b-hwk9j\" (UID: \"97ee2afe-4504-4602-8708-09f8ccae07dc\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.108068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98kvn\" (UniqueName: \"kubernetes.io/projected/97ee2afe-4504-4602-8708-09f8ccae07dc-kube-api-access-98kvn\") pod \"rabbitmq-cluster-operator-779fc9694b-hwk9j\" (UID: \"97ee2afe-4504-4602-8708-09f8ccae07dc\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.129690 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98kvn\" (UniqueName: \"kubernetes.io/projected/97ee2afe-4504-4602-8708-09f8ccae07dc-kube-api-access-98kvn\") pod \"rabbitmq-cluster-operator-779fc9694b-hwk9j\" (UID: \"97ee2afe-4504-4602-8708-09f8ccae07dc\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.273054 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.363155 4703 generic.go:334] "Generic (PLEG): container finished" podID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerID="15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3" exitCode=0 Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.364781 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerDied","Data":"15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3"} Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.364833 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerStarted","Data":"f154849aabbfa160a84cd9ba39625ceeb185319293ce0740bca0169c1e4af4aa"} Oct 11 04:07:05 crc kubenswrapper[4703]: I1011 04:07:05.734164 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j"] Oct 11 04:07:06 crc kubenswrapper[4703]: I1011 04:07:06.370842 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerStarted","Data":"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a"} Oct 11 04:07:06 crc kubenswrapper[4703]: I1011 04:07:06.373197 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" event={"ID":"97ee2afe-4504-4602-8708-09f8ccae07dc","Type":"ContainerStarted","Data":"a30bda311201403cad4aa6fc2e5c4d15d6425e09c3cae485e0796a00e75d2b75"} Oct 11 04:07:07 crc kubenswrapper[4703]: I1011 04:07:07.388108 4703 generic.go:334] "Generic (PLEG): container finished" 
podID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerID="289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a" exitCode=0 Oct 11 04:07:07 crc kubenswrapper[4703]: I1011 04:07:07.388334 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerDied","Data":"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a"} Oct 11 04:07:08 crc kubenswrapper[4703]: I1011 04:07:08.401232 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerStarted","Data":"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c"} Oct 11 04:07:08 crc kubenswrapper[4703]: I1011 04:07:08.402885 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" event={"ID":"97ee2afe-4504-4602-8708-09f8ccae07dc","Type":"ContainerStarted","Data":"5baa90119f586356d140df7f3f3654c98215c9e5b5d30b8b42a78320b648b160"} Oct 11 04:07:08 crc kubenswrapper[4703]: I1011 04:07:08.428654 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wg2kn" podStartSLOduration=3.021155459 podStartE2EDuration="5.428622413s" podCreationTimestamp="2025-10-11 04:07:03 +0000 UTC" firstStartedPulling="2025-10-11 04:07:05.36637187 +0000 UTC m=+716.576853812" lastFinishedPulling="2025-10-11 04:07:07.773838804 +0000 UTC m=+718.984320766" observedRunningTime="2025-10-11 04:07:08.420077238 +0000 UTC m=+719.630559170" watchObservedRunningTime="2025-10-11 04:07:08.428622413 +0000 UTC m=+719.639104385" Oct 11 04:07:08 crc kubenswrapper[4703]: I1011 04:07:08.441529 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hwk9j" podStartSLOduration=2.926692315 
podStartE2EDuration="4.441503362s" podCreationTimestamp="2025-10-11 04:07:04 +0000 UTC" firstStartedPulling="2025-10-11 04:07:05.740239628 +0000 UTC m=+716.950721560" lastFinishedPulling="2025-10-11 04:07:07.255050685 +0000 UTC m=+718.465532607" observedRunningTime="2025-10-11 04:07:08.434317373 +0000 UTC m=+719.644799305" watchObservedRunningTime="2025-10-11 04:07:08.441503362 +0000 UTC m=+719.651985324" Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.877226 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.880101 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.895514 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.987495 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckmf\" (UniqueName: \"kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.987545 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:09 crc kubenswrapper[4703]: I1011 04:07:09.987567 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.088763 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckmf\" (UniqueName: \"kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.088831 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.088860 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.089359 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.090160 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.110591 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckmf\" (UniqueName: \"kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf\") pod \"redhat-operators-r99fj\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.198455 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:10 crc kubenswrapper[4703]: I1011 04:07:10.610238 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:10 crc kubenswrapper[4703]: W1011 04:07:10.614128 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217ecb68_1935_4fe1_8127_2fc9e2305f16.slice/crio-bc46b87ca0194ca44d4142c2b839443d0638c4410264a944ac37f46699a2772b WatchSource:0}: Error finding container bc46b87ca0194ca44d4142c2b839443d0638c4410264a944ac37f46699a2772b: Status 404 returned error can't find the container with id bc46b87ca0194ca44d4142c2b839443d0638c4410264a944ac37f46699a2772b Oct 11 04:07:11 crc kubenswrapper[4703]: I1011 04:07:11.421328 4703 generic.go:334] "Generic (PLEG): container finished" podID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerID="aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0" exitCode=0 Oct 11 04:07:11 crc kubenswrapper[4703]: I1011 04:07:11.421386 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" 
event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerDied","Data":"aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0"} Oct 11 04:07:11 crc kubenswrapper[4703]: I1011 04:07:11.421624 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerStarted","Data":"bc46b87ca0194ca44d4142c2b839443d0638c4410264a944ac37f46699a2772b"} Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.087279 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.090608 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.092964 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-w7lj2" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.093049 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.093062 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.093074 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.093099 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.104609 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118104 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbzs\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-kube-api-access-8mbzs\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118169 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118237 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118260 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118307 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118335 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118392 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.118411 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.219876 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.219935 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.219970 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.220004 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.220066 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.220086 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.220115 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbzs\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-kube-api-access-8mbzs\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.220149 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.221058 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.221293 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.221301 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.223792 4703 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.223871 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f45a7add5cfaa10dd0bb3c9087b211b3ebc22223b66a566264d93f1c2211c956/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.226937 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.226963 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.237976 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbzs\" (UniqueName: \"kubernetes.io/projected/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-kube-api-access-8mbzs\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.240669 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.255682 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b95eef69-af56-469f-8d26-aeb8dbdf9a96\") pod \"rabbitmq-server-0\" (UID: \"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.429271 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerStarted","Data":"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d"} Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.456737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:12 crc kubenswrapper[4703]: I1011 04:07:12.917568 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 11 04:07:13 crc kubenswrapper[4703]: I1011 04:07:13.441329 4703 generic.go:334] "Generic (PLEG): container finished" podID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerID="0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d" exitCode=0 Oct 11 04:07:13 crc kubenswrapper[4703]: I1011 04:07:13.441544 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerDied","Data":"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d"} Oct 11 04:07:13 crc kubenswrapper[4703]: I1011 04:07:13.443743 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" 
event={"ID":"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8","Type":"ContainerStarted","Data":"67cac6e3c9e389687c970a0585a0401488b6e25925a57049f0fcd033d0fbc6dd"} Oct 11 04:07:13 crc kubenswrapper[4703]: I1011 04:07:13.986972 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:13 crc kubenswrapper[4703]: I1011 04:07:13.987335 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:14 crc kubenswrapper[4703]: I1011 04:07:14.038824 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:14 crc kubenswrapper[4703]: I1011 04:07:14.456811 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerStarted","Data":"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4"} Oct 11 04:07:14 crc kubenswrapper[4703]: I1011 04:07:14.481107 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r99fj" podStartSLOduration=3.047358769 podStartE2EDuration="5.481086534s" podCreationTimestamp="2025-10-11 04:07:09 +0000 UTC" firstStartedPulling="2025-10-11 04:07:11.423242957 +0000 UTC m=+722.633724889" lastFinishedPulling="2025-10-11 04:07:13.856970722 +0000 UTC m=+725.067452654" observedRunningTime="2025-10-11 04:07:14.478790333 +0000 UTC m=+725.689272255" watchObservedRunningTime="2025-10-11 04:07:14.481086534 +0000 UTC m=+725.691568456" Oct 11 04:07:14 crc kubenswrapper[4703]: I1011 04:07:14.504111 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:17 crc kubenswrapper[4703]: I1011 04:07:17.865863 4703 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-index-6r6dw"] Oct 11 04:07:17 crc kubenswrapper[4703]: I1011 04:07:17.866815 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:17 crc kubenswrapper[4703]: I1011 04:07:17.870226 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-bbmdm" Oct 11 04:07:17 crc kubenswrapper[4703]: I1011 04:07:17.876504 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6r6dw"] Oct 11 04:07:17 crc kubenswrapper[4703]: I1011 04:07:17.920361 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg744\" (UniqueName: \"kubernetes.io/projected/2768ed32-fa2e-42a9-b5d3-be0483d294a8-kube-api-access-dg744\") pod \"keystone-operator-index-6r6dw\" (UID: \"2768ed32-fa2e-42a9-b5d3-be0483d294a8\") " pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:18 crc kubenswrapper[4703]: I1011 04:07:18.021697 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg744\" (UniqueName: \"kubernetes.io/projected/2768ed32-fa2e-42a9-b5d3-be0483d294a8-kube-api-access-dg744\") pod \"keystone-operator-index-6r6dw\" (UID: \"2768ed32-fa2e-42a9-b5d3-be0483d294a8\") " pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:18 crc kubenswrapper[4703]: I1011 04:07:18.044010 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg744\" (UniqueName: \"kubernetes.io/projected/2768ed32-fa2e-42a9-b5d3-be0483d294a8-kube-api-access-dg744\") pod \"keystone-operator-index-6r6dw\" (UID: \"2768ed32-fa2e-42a9-b5d3-be0483d294a8\") " pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:18 crc kubenswrapper[4703]: I1011 04:07:18.190246 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:19 crc kubenswrapper[4703]: W1011 04:07:19.626927 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2768ed32_fa2e_42a9_b5d3_be0483d294a8.slice/crio-52429af56d355980c7dd356372148a694fae133f798f5607121d97e626bc6af7 WatchSource:0}: Error finding container 52429af56d355980c7dd356372148a694fae133f798f5607121d97e626bc6af7: Status 404 returned error can't find the container with id 52429af56d355980c7dd356372148a694fae133f798f5607121d97e626bc6af7 Oct 11 04:07:19 crc kubenswrapper[4703]: I1011 04:07:19.627229 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6r6dw"] Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.199098 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.199566 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.255989 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.256365 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.262013 4703 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.262291 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wg2kn" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="registry-server" containerID="cri-o://5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c" gracePeriod=2 Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.283823 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.512918 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6r6dw" event={"ID":"2768ed32-fa2e-42a9-b5d3-be0483d294a8","Type":"ContainerStarted","Data":"52429af56d355980c7dd356372148a694fae133f798f5607121d97e626bc6af7"} Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.572606 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.922638 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.982503 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjf4j\" (UniqueName: \"kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j\") pod \"c578ce60-c402-45aa-96cd-98ca13d25a89\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.982548 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content\") pod \"c578ce60-c402-45aa-96cd-98ca13d25a89\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.982672 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities\") pod \"c578ce60-c402-45aa-96cd-98ca13d25a89\" (UID: \"c578ce60-c402-45aa-96cd-98ca13d25a89\") " Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.983379 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities" (OuterVolumeSpecName: "utilities") pod "c578ce60-c402-45aa-96cd-98ca13d25a89" (UID: "c578ce60-c402-45aa-96cd-98ca13d25a89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:20 crc kubenswrapper[4703]: I1011 04:07:20.987624 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j" (OuterVolumeSpecName: "kube-api-access-pjf4j") pod "c578ce60-c402-45aa-96cd-98ca13d25a89" (UID: "c578ce60-c402-45aa-96cd-98ca13d25a89"). InnerVolumeSpecName "kube-api-access-pjf4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.039547 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c578ce60-c402-45aa-96cd-98ca13d25a89" (UID: "c578ce60-c402-45aa-96cd-98ca13d25a89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.084152 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.084196 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjf4j\" (UniqueName: \"kubernetes.io/projected/c578ce60-c402-45aa-96cd-98ca13d25a89-kube-api-access-pjf4j\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.084210 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c578ce60-c402-45aa-96cd-98ca13d25a89-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.526939 4703 generic.go:334] "Generic (PLEG): container finished" podID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerID="5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c" exitCode=0 Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.527069 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerDied","Data":"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c"} Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.527179 4703 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wg2kn" event={"ID":"c578ce60-c402-45aa-96cd-98ca13d25a89","Type":"ContainerDied","Data":"f154849aabbfa160a84cd9ba39625ceeb185319293ce0740bca0169c1e4af4aa"} Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.527236 4703 scope.go:117] "RemoveContainer" containerID="5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.527451 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wg2kn" Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.581632 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:21 crc kubenswrapper[4703]: I1011 04:07:21.588937 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wg2kn"] Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.218003 4703 scope.go:117] "RemoveContainer" containerID="289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.239552 4703 scope.go:117] "RemoveContainer" containerID="15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.267938 4703 scope.go:117] "RemoveContainer" containerID="5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c" Oct 11 04:07:22 crc kubenswrapper[4703]: E1011 04:07:22.268384 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c\": container with ID starting with 5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c not found: ID does not exist" containerID="5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 
04:07:22.268444 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c"} err="failed to get container status \"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c\": rpc error: code = NotFound desc = could not find container \"5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c\": container with ID starting with 5868bfd123afff277628a46bff60e113d52b40821090eefb22c659718b86716c not found: ID does not exist" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.268496 4703 scope.go:117] "RemoveContainer" containerID="289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a" Oct 11 04:07:22 crc kubenswrapper[4703]: E1011 04:07:22.268803 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a\": container with ID starting with 289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a not found: ID does not exist" containerID="289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.268840 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a"} err="failed to get container status \"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a\": rpc error: code = NotFound desc = could not find container \"289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a\": container with ID starting with 289deef96bbe31217f6314c34973f8ccc2325233be200468458f13ecf279eb4a not found: ID does not exist" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.268863 4703 scope.go:117] "RemoveContainer" containerID="15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3" Oct 11 04:07:22 crc 
kubenswrapper[4703]: E1011 04:07:22.269107 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3\": container with ID starting with 15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3 not found: ID does not exist" containerID="15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.269134 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3"} err="failed to get container status \"15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3\": rpc error: code = NotFound desc = could not find container \"15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3\": container with ID starting with 15522e0e68cf63ddf2a0ca7280a61650357fdc6eb895573c6ddd5843b271f5f3 not found: ID does not exist" Oct 11 04:07:22 crc kubenswrapper[4703]: I1011 04:07:22.541339 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8","Type":"ContainerStarted","Data":"9e53156de01fc9916db868edf1c72334086114aa8f69fcd97f455f819abc313a"} Oct 11 04:07:23 crc kubenswrapper[4703]: I1011 04:07:23.551201 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" path="/var/lib/kubelet/pods/c578ce60-c402-45aa-96cd-98ca13d25a89/volumes" Oct 11 04:07:23 crc kubenswrapper[4703]: I1011 04:07:23.663029 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:23 crc kubenswrapper[4703]: I1011 04:07:23.663285 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r99fj" 
podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="registry-server" containerID="cri-o://04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4" gracePeriod=2 Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.074367 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.128182 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content\") pod \"217ecb68-1935-4fe1-8127-2fc9e2305f16\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.133727 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckmf\" (UniqueName: \"kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf\") pod \"217ecb68-1935-4fe1-8127-2fc9e2305f16\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.133808 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities\") pod \"217ecb68-1935-4fe1-8127-2fc9e2305f16\" (UID: \"217ecb68-1935-4fe1-8127-2fc9e2305f16\") " Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.134553 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities" (OuterVolumeSpecName: "utilities") pod "217ecb68-1935-4fe1-8127-2fc9e2305f16" (UID: "217ecb68-1935-4fe1-8127-2fc9e2305f16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.138804 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf" (OuterVolumeSpecName: "kube-api-access-rckmf") pod "217ecb68-1935-4fe1-8127-2fc9e2305f16" (UID: "217ecb68-1935-4fe1-8127-2fc9e2305f16"). InnerVolumeSpecName "kube-api-access-rckmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.201689 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "217ecb68-1935-4fe1-8127-2fc9e2305f16" (UID: "217ecb68-1935-4fe1-8127-2fc9e2305f16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.235783 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.235808 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/217ecb68-1935-4fe1-8127-2fc9e2305f16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.235819 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rckmf\" (UniqueName: \"kubernetes.io/projected/217ecb68-1935-4fe1-8127-2fc9e2305f16-kube-api-access-rckmf\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.557240 4703 generic.go:334] "Generic (PLEG): container finished" podID="217ecb68-1935-4fe1-8127-2fc9e2305f16" 
containerID="04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4" exitCode=0 Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.557272 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerDied","Data":"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4"} Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.557295 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r99fj" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.557313 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r99fj" event={"ID":"217ecb68-1935-4fe1-8127-2fc9e2305f16","Type":"ContainerDied","Data":"bc46b87ca0194ca44d4142c2b839443d0638c4410264a944ac37f46699a2772b"} Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.557362 4703 scope.go:117] "RemoveContainer" containerID="04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.558677 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6r6dw" event={"ID":"2768ed32-fa2e-42a9-b5d3-be0483d294a8","Type":"ContainerStarted","Data":"5746e77b946a52335163f08e59bd043e7c3b391bf8a444b474f8e4492eac9af8"} Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.575758 4703 scope.go:117] "RemoveContainer" containerID="0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.583541 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-6r6dw" podStartSLOduration=3.429100296 podStartE2EDuration="7.583450244s" podCreationTimestamp="2025-10-11 04:07:17 +0000 UTC" firstStartedPulling="2025-10-11 04:07:19.629882247 +0000 UTC m=+730.840364219" 
lastFinishedPulling="2025-10-11 04:07:23.784232245 +0000 UTC m=+734.994714167" observedRunningTime="2025-10-11 04:07:24.581479651 +0000 UTC m=+735.791961573" watchObservedRunningTime="2025-10-11 04:07:24.583450244 +0000 UTC m=+735.793932186" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.600728 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.605745 4703 scope.go:117] "RemoveContainer" containerID="aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.609912 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r99fj"] Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.624772 4703 scope.go:117] "RemoveContainer" containerID="04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4" Oct 11 04:07:24 crc kubenswrapper[4703]: E1011 04:07:24.626904 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4\": container with ID starting with 04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4 not found: ID does not exist" containerID="04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.626946 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4"} err="failed to get container status \"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4\": rpc error: code = NotFound desc = could not find container \"04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4\": container with ID starting with 04b0cf647aa29bbbc4650c1f6ede201f3e30bf3fb6ceeb7d96f4f8f03e30aac4 not found: ID does not exist" 
Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.626975 4703 scope.go:117] "RemoveContainer" containerID="0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d" Oct 11 04:07:24 crc kubenswrapper[4703]: E1011 04:07:24.627474 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d\": container with ID starting with 0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d not found: ID does not exist" containerID="0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.627512 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d"} err="failed to get container status \"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d\": rpc error: code = NotFound desc = could not find container \"0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d\": container with ID starting with 0f4b1354d7cddbde6fccdd242e4e8ad4cf5a943622b16309d332f0d76caf560d not found: ID does not exist" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.627539 4703 scope.go:117] "RemoveContainer" containerID="aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0" Oct 11 04:07:24 crc kubenswrapper[4703]: E1011 04:07:24.627956 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0\": container with ID starting with aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0 not found: ID does not exist" containerID="aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0" Oct 11 04:07:24 crc kubenswrapper[4703]: I1011 04:07:24.627986 4703 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0"} err="failed to get container status \"aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0\": rpc error: code = NotFound desc = could not find container \"aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0\": container with ID starting with aaf8aa77c63689d03b280409f3d98f5289cb881aecf51de89f84a12ee2a48fc0 not found: ID does not exist" Oct 11 04:07:25 crc kubenswrapper[4703]: I1011 04:07:25.547730 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" path="/var/lib/kubelet/pods/217ecb68-1935-4fe1-8127-2fc9e2305f16/volumes" Oct 11 04:07:28 crc kubenswrapper[4703]: I1011 04:07:28.191399 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:28 crc kubenswrapper[4703]: I1011 04:07:28.191839 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:28 crc kubenswrapper[4703]: I1011 04:07:28.235306 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:28 crc kubenswrapper[4703]: I1011 04:07:28.628921 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-6r6dw" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.077601 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078510 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078533 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078569 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078583 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078614 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="extract-content" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078651 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="extract-content" Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078688 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="extract-content" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078703 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="extract-content" Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078720 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="extract-utilities" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078734 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="extract-utilities" Oct 11 04:07:31 crc kubenswrapper[4703]: E1011 04:07:31.078758 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="extract-utilities" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078770 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="extract-utilities" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078963 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="217ecb68-1935-4fe1-8127-2fc9e2305f16" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.078985 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="c578ce60-c402-45aa-96cd-98ca13d25a89" containerName="registry-server" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.080505 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.101678 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.138388 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.139047 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.139340 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psr5z\" (UniqueName: \"kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z\") pod \"redhat-marketplace-zhw49\" (UID: 
\"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.241190 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.241258 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psr5z\" (UniqueName: \"kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.241307 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.242079 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.242098 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " 
pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.286340 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psr5z\" (UniqueName: \"kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z\") pod \"redhat-marketplace-zhw49\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.448104 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.919666 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.945370 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz"] Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.946737 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.948846 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.951280 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsz66\" (UniqueName: \"kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.951332 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.951401 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:31 crc kubenswrapper[4703]: I1011 04:07:31.961290 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz"] Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 
04:07:32.052841 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsz66\" (UniqueName: \"kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.052891 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.052937 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.053425 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.053896 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.078315 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsz66\" (UniqueName: \"kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66\") pod \"d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.302320 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.621520 4703 generic.go:334] "Generic (PLEG): container finished" podID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerID="5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9" exitCode=0 Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.621701 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerDied","Data":"5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9"} Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.621874 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerStarted","Data":"e8c63b607a9061a0202c31cb7cae5d2ec5af16da2f5d0c661499785a7688c955"} Oct 11 04:07:32 crc kubenswrapper[4703]: I1011 04:07:32.739050 4703 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz"] Oct 11 04:07:33 crc kubenswrapper[4703]: I1011 04:07:33.629510 4703 generic.go:334] "Generic (PLEG): container finished" podID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerID="5db9b00bf519b8127c49af6b29e3f14f9324ad83f715601c6d778334c63bacda" exitCode=0 Oct 11 04:07:33 crc kubenswrapper[4703]: I1011 04:07:33.629607 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerDied","Data":"5db9b00bf519b8127c49af6b29e3f14f9324ad83f715601c6d778334c63bacda"} Oct 11 04:07:33 crc kubenswrapper[4703]: I1011 04:07:33.629963 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerStarted","Data":"ed8768ddc615597bb58730ca2c7a4771ce11ab537c90bbaa2cfaf1ff72c20b7c"} Oct 11 04:07:33 crc kubenswrapper[4703]: I1011 04:07:33.632007 4703 generic.go:334] "Generic (PLEG): container finished" podID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerID="e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4" exitCode=0 Oct 11 04:07:33 crc kubenswrapper[4703]: I1011 04:07:33.632034 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerDied","Data":"e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4"} Oct 11 04:07:34 crc kubenswrapper[4703]: I1011 04:07:34.640881 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" 
event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerStarted","Data":"542742780c7d67b9e7dee771b5cd0cb61627e42c480109397b983f1901fc158f"} Oct 11 04:07:34 crc kubenswrapper[4703]: I1011 04:07:34.644136 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerStarted","Data":"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729"} Oct 11 04:07:34 crc kubenswrapper[4703]: I1011 04:07:34.677136 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zhw49" podStartSLOduration=2.268361755 podStartE2EDuration="3.677119812s" podCreationTimestamp="2025-10-11 04:07:31 +0000 UTC" firstStartedPulling="2025-10-11 04:07:32.624124294 +0000 UTC m=+743.834606226" lastFinishedPulling="2025-10-11 04:07:34.032882361 +0000 UTC m=+745.243364283" observedRunningTime="2025-10-11 04:07:34.672667045 +0000 UTC m=+745.883148977" watchObservedRunningTime="2025-10-11 04:07:34.677119812 +0000 UTC m=+745.887601734" Oct 11 04:07:35 crc kubenswrapper[4703]: I1011 04:07:35.651720 4703 generic.go:334] "Generic (PLEG): container finished" podID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerID="542742780c7d67b9e7dee771b5cd0cb61627e42c480109397b983f1901fc158f" exitCode=0 Oct 11 04:07:35 crc kubenswrapper[4703]: I1011 04:07:35.651786 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerDied","Data":"542742780c7d67b9e7dee771b5cd0cb61627e42c480109397b983f1901fc158f"} Oct 11 04:07:36 crc kubenswrapper[4703]: I1011 04:07:36.662120 4703 generic.go:334] "Generic (PLEG): container finished" podID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerID="0f7db8993daab9b96414b8e117e77a6f70710562f626e802dd49ee24fbff3c3a" exitCode=0 Oct 11 04:07:36 crc 
kubenswrapper[4703]: I1011 04:07:36.662196 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerDied","Data":"0f7db8993daab9b96414b8e117e77a6f70710562f626e802dd49ee24fbff3c3a"} Oct 11 04:07:37 crc kubenswrapper[4703]: I1011 04:07:37.985690 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.040562 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle\") pod \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.040632 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util\") pod \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.040719 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsz66\" (UniqueName: \"kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66\") pod \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\" (UID: \"f072bbf3-5db0-44fa-853c-fb41e667cc6e\") " Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.041443 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle" (OuterVolumeSpecName: "bundle") pod "f072bbf3-5db0-44fa-853c-fb41e667cc6e" (UID: "f072bbf3-5db0-44fa-853c-fb41e667cc6e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.046898 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66" (OuterVolumeSpecName: "kube-api-access-bsz66") pod "f072bbf3-5db0-44fa-853c-fb41e667cc6e" (UID: "f072bbf3-5db0-44fa-853c-fb41e667cc6e"). InnerVolumeSpecName "kube-api-access-bsz66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.070399 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util" (OuterVolumeSpecName: "util") pod "f072bbf3-5db0-44fa-853c-fb41e667cc6e" (UID: "f072bbf3-5db0-44fa-853c-fb41e667cc6e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.142288 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsz66\" (UniqueName: \"kubernetes.io/projected/f072bbf3-5db0-44fa-853c-fb41e667cc6e-kube-api-access-bsz66\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.142325 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.142339 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f072bbf3-5db0-44fa-853c-fb41e667cc6e-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.675637 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" 
event={"ID":"f072bbf3-5db0-44fa-853c-fb41e667cc6e","Type":"ContainerDied","Data":"ed8768ddc615597bb58730ca2c7a4771ce11ab537c90bbaa2cfaf1ff72c20b7c"} Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.675678 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz" Oct 11 04:07:38 crc kubenswrapper[4703]: I1011 04:07:38.675681 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8768ddc615597bb58730ca2c7a4771ce11ab537c90bbaa2cfaf1ff72c20b7c" Oct 11 04:07:41 crc kubenswrapper[4703]: I1011 04:07:41.448847 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:41 crc kubenswrapper[4703]: I1011 04:07:41.449066 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:41 crc kubenswrapper[4703]: I1011 04:07:41.497634 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:41 crc kubenswrapper[4703]: I1011 04:07:41.769719 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.259447 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.260216 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zhw49" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="registry-server" containerID="cri-o://8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729" gracePeriod=2 Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.683079 4703 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.750744 4703 generic.go:334] "Generic (PLEG): container finished" podID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerID="8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729" exitCode=0 Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.750792 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerDied","Data":"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729"} Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.750819 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhw49" event={"ID":"a0de26a3-8120-4966-b755-f4ad7319b64d","Type":"ContainerDied","Data":"e8c63b607a9061a0202c31cb7cae5d2ec5af16da2f5d0c661499785a7688c955"} Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.750839 4703 scope.go:117] "RemoveContainer" containerID="8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.750943 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhw49" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.770364 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities\") pod \"a0de26a3-8120-4966-b755-f4ad7319b64d\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.770488 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psr5z\" (UniqueName: \"kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z\") pod \"a0de26a3-8120-4966-b755-f4ad7319b64d\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.770522 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content\") pod \"a0de26a3-8120-4966-b755-f4ad7319b64d\" (UID: \"a0de26a3-8120-4966-b755-f4ad7319b64d\") " Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.774863 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities" (OuterVolumeSpecName: "utilities") pod "a0de26a3-8120-4966-b755-f4ad7319b64d" (UID: "a0de26a3-8120-4966-b755-f4ad7319b64d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.782832 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z" (OuterVolumeSpecName: "kube-api-access-psr5z") pod "a0de26a3-8120-4966-b755-f4ad7319b64d" (UID: "a0de26a3-8120-4966-b755-f4ad7319b64d"). InnerVolumeSpecName "kube-api-access-psr5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.782986 4703 scope.go:117] "RemoveContainer" containerID="e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.789343 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0de26a3-8120-4966-b755-f4ad7319b64d" (UID: "a0de26a3-8120-4966-b755-f4ad7319b64d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.811653 4703 scope.go:117] "RemoveContainer" containerID="5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.833894 4703 scope.go:117] "RemoveContainer" containerID="8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729" Oct 11 04:07:46 crc kubenswrapper[4703]: E1011 04:07:46.834340 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729\": container with ID starting with 8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729 not found: ID does not exist" containerID="8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.834374 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729"} err="failed to get container status \"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729\": rpc error: code = NotFound desc = could not find container \"8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729\": container with ID starting 
with 8855e961cdb120d4b7a50f32f23ca7a3a505a518767d0aa67cd3704db473f729 not found: ID does not exist" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.834395 4703 scope.go:117] "RemoveContainer" containerID="e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4" Oct 11 04:07:46 crc kubenswrapper[4703]: E1011 04:07:46.834667 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4\": container with ID starting with e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4 not found: ID does not exist" containerID="e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.834688 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4"} err="failed to get container status \"e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4\": rpc error: code = NotFound desc = could not find container \"e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4\": container with ID starting with e9aa29e06bdcfb413482874f4026c2c9dd7bb069e34fbcbe3dd46a950fda1ce4 not found: ID does not exist" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.834700 4703 scope.go:117] "RemoveContainer" containerID="5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9" Oct 11 04:07:46 crc kubenswrapper[4703]: E1011 04:07:46.834961 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9\": container with ID starting with 5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9 not found: ID does not exist" containerID="5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9" Oct 11 04:07:46 
crc kubenswrapper[4703]: I1011 04:07:46.834982 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9"} err="failed to get container status \"5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9\": rpc error: code = NotFound desc = could not find container \"5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9\": container with ID starting with 5a4a740dc974291f5f3971a247d77b95ce1b7423606fe3dfa1eb9dddfd3ce6c9 not found: ID does not exist" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.872161 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psr5z\" (UniqueName: \"kubernetes.io/projected/a0de26a3-8120-4966-b755-f4ad7319b64d-kube-api-access-psr5z\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.872198 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:46 crc kubenswrapper[4703]: I1011 04:07:46.872207 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0de26a3-8120-4966-b755-f4ad7319b64d-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:07:47 crc kubenswrapper[4703]: I1011 04:07:47.075542 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:47 crc kubenswrapper[4703]: I1011 04:07:47.079349 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhw49"] Oct 11 04:07:47 crc kubenswrapper[4703]: I1011 04:07:47.541970 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" path="/var/lib/kubelet/pods/a0de26a3-8120-4966-b755-f4ad7319b64d/volumes" Oct 11 04:07:48 
crc kubenswrapper[4703]: I1011 04:07:48.122517 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq"] Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122792 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="pull" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122809 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="pull" Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122824 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="extract-content" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122831 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="extract-content" Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122846 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="util" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122854 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="util" Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122864 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="registry-server" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122871 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="registry-server" Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122882 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="extract" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122891 4703 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="extract" Oct 11 04:07:48 crc kubenswrapper[4703]: E1011 04:07:48.122902 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="extract-utilities" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.122910 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="extract-utilities" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.123036 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0de26a3-8120-4966-b755-f4ad7319b64d" containerName="registry-server" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.123058 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f072bbf3-5db0-44fa-853c-fb41e667cc6e" containerName="extract" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.123710 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.125734 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-52scj" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.125789 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.140999 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq"] Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.188825 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7xmp\" (UniqueName: \"kubernetes.io/projected/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-kube-api-access-p7xmp\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.188940 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-webhook-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.189044 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-apiservice-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: 
\"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.290112 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-apiservice-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.290181 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7xmp\" (UniqueName: \"kubernetes.io/projected/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-kube-api-access-p7xmp\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.290251 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-webhook-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.294061 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-apiservice-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.296309 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-webhook-cert\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.312589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7xmp\" (UniqueName: \"kubernetes.io/projected/582b8e4d-9fe3-4e10-b9eb-c13c443128a8-kube-api-access-p7xmp\") pod \"keystone-operator-controller-manager-bc7dd8474-ppwvq\" (UID: \"582b8e4d-9fe3-4e10-b9eb-c13c443128a8\") " pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.440236 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:48 crc kubenswrapper[4703]: I1011 04:07:48.852311 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq"] Oct 11 04:07:48 crc kubenswrapper[4703]: W1011 04:07:48.859810 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582b8e4d_9fe3_4e10_b9eb_c13c443128a8.slice/crio-4bfd3f621a9e83389b19836108a960b65a3b68beec634c6aa86345808475720f WatchSource:0}: Error finding container 4bfd3f621a9e83389b19836108a960b65a3b68beec634c6aa86345808475720f: Status 404 returned error can't find the container with id 4bfd3f621a9e83389b19836108a960b65a3b68beec634c6aa86345808475720f Oct 11 04:07:49 crc kubenswrapper[4703]: I1011 04:07:49.777307 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" 
event={"ID":"582b8e4d-9fe3-4e10-b9eb-c13c443128a8","Type":"ContainerStarted","Data":"4bfd3f621a9e83389b19836108a960b65a3b68beec634c6aa86345808475720f"} Oct 11 04:07:50 crc kubenswrapper[4703]: I1011 04:07:50.254438 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:07:50 crc kubenswrapper[4703]: I1011 04:07:50.254522 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:07:51 crc kubenswrapper[4703]: I1011 04:07:51.795962 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" event={"ID":"582b8e4d-9fe3-4e10-b9eb-c13c443128a8","Type":"ContainerStarted","Data":"2aa1ed960773bcc560522dc95f19ee1b38152291dbaea9ec5a19dd95e6304ccb"} Oct 11 04:07:51 crc kubenswrapper[4703]: I1011 04:07:51.796526 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" event={"ID":"582b8e4d-9fe3-4e10-b9eb-c13c443128a8","Type":"ContainerStarted","Data":"605ffdbed177eb56fe6b9e0dbf15edbc9de1062d4f88a314700d4b377b4ea15f"} Oct 11 04:07:51 crc kubenswrapper[4703]: I1011 04:07:51.796557 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:07:51 crc kubenswrapper[4703]: I1011 04:07:51.817177 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" podStartSLOduration=2.04751443 podStartE2EDuration="3.817157243s" podCreationTimestamp="2025-10-11 04:07:48 +0000 UTC" firstStartedPulling="2025-10-11 04:07:48.862356667 +0000 UTC m=+760.072838589" lastFinishedPulling="2025-10-11 04:07:50.63199948 +0000 UTC m=+761.842481402" observedRunningTime="2025-10-11 04:07:51.810931239 +0000 UTC m=+763.021413191" watchObservedRunningTime="2025-10-11 04:07:51.817157243 +0000 UTC m=+763.027639175" Oct 11 04:07:54 crc kubenswrapper[4703]: I1011 04:07:54.819554 4703 generic.go:334] "Generic (PLEG): container finished" podID="ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8" containerID="9e53156de01fc9916db868edf1c72334086114aa8f69fcd97f455f819abc313a" exitCode=0 Oct 11 04:07:54 crc kubenswrapper[4703]: I1011 04:07:54.819667 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8","Type":"ContainerDied","Data":"9e53156de01fc9916db868edf1c72334086114aa8f69fcd97f455f819abc313a"} Oct 11 04:07:55 crc kubenswrapper[4703]: I1011 04:07:55.827119 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8","Type":"ContainerStarted","Data":"ecf6030b88ab296f4af580df241235c603844e2d8da63b74882683a59a741bc7"} Oct 11 04:07:55 crc kubenswrapper[4703]: I1011 04:07:55.827723 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:07:55 crc kubenswrapper[4703]: I1011 04:07:55.846779 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.135635066 podStartE2EDuration="44.846761138s" podCreationTimestamp="2025-10-11 04:07:11 +0000 UTC" firstStartedPulling="2025-10-11 04:07:12.921980711 +0000 UTC m=+724.132462633" lastFinishedPulling="2025-10-11 
04:07:20.633106773 +0000 UTC m=+731.843588705" observedRunningTime="2025-10-11 04:07:55.844952301 +0000 UTC m=+767.055434283" watchObservedRunningTime="2025-10-11 04:07:55.846761138 +0000 UTC m=+767.057243060" Oct 11 04:07:58 crc kubenswrapper[4703]: I1011 04:07:58.445335 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-bc7dd8474-ppwvq" Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.370591 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-xfg4f"] Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.371597 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.383048 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xfg4f"] Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.475340 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cvzn\" (UniqueName: \"kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn\") pod \"keystone-db-create-xfg4f\" (UID: \"571abf26-029e-4204-8f2b-63e1bb50bae4\") " pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.576711 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cvzn\" (UniqueName: \"kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn\") pod \"keystone-db-create-xfg4f\" (UID: \"571abf26-029e-4204-8f2b-63e1bb50bae4\") " pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.596587 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cvzn\" (UniqueName: 
\"kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn\") pod \"keystone-db-create-xfg4f\" (UID: \"571abf26-029e-4204-8f2b-63e1bb50bae4\") " pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:01 crc kubenswrapper[4703]: I1011 04:08:01.721250 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:02 crc kubenswrapper[4703]: I1011 04:08:02.187906 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xfg4f"] Oct 11 04:08:02 crc kubenswrapper[4703]: I1011 04:08:02.876391 4703 generic.go:334] "Generic (PLEG): container finished" podID="571abf26-029e-4204-8f2b-63e1bb50bae4" containerID="135c5993db5db105942b9821e4ff461b64967f201f0295570fe178c5ba0c8cd3" exitCode=0 Oct 11 04:08:02 crc kubenswrapper[4703]: I1011 04:08:02.876517 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xfg4f" event={"ID":"571abf26-029e-4204-8f2b-63e1bb50bae4","Type":"ContainerDied","Data":"135c5993db5db105942b9821e4ff461b64967f201f0295570fe178c5ba0c8cd3"} Oct 11 04:08:02 crc kubenswrapper[4703]: I1011 04:08:02.876884 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xfg4f" event={"ID":"571abf26-029e-4204-8f2b-63e1bb50bae4","Type":"ContainerStarted","Data":"f8d5f45e3cd67b0caf88743b57b61eb42d352ad7866d42e7bc9db6d744e4c173"} Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.219558 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.315497 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cvzn\" (UniqueName: \"kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn\") pod \"571abf26-029e-4204-8f2b-63e1bb50bae4\" (UID: \"571abf26-029e-4204-8f2b-63e1bb50bae4\") " Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.320822 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn" (OuterVolumeSpecName: "kube-api-access-9cvzn") pod "571abf26-029e-4204-8f2b-63e1bb50bae4" (UID: "571abf26-029e-4204-8f2b-63e1bb50bae4"). InnerVolumeSpecName "kube-api-access-9cvzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.416690 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cvzn\" (UniqueName: \"kubernetes.io/projected/571abf26-029e-4204-8f2b-63e1bb50bae4-kube-api-access-9cvzn\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.892325 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-xfg4f" event={"ID":"571abf26-029e-4204-8f2b-63e1bb50bae4","Type":"ContainerDied","Data":"f8d5f45e3cd67b0caf88743b57b61eb42d352ad7866d42e7bc9db6d744e4c173"} Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.892660 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d5f45e3cd67b0caf88743b57b61eb42d352ad7866d42e7bc9db6d744e4c173" Oct 11 04:08:04 crc kubenswrapper[4703]: I1011 04:08:04.892401 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-xfg4f" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.284945 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-ec7d-account-create-p9vkj"] Oct 11 04:08:11 crc kubenswrapper[4703]: E1011 04:08:11.287705 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571abf26-029e-4204-8f2b-63e1bb50bae4" containerName="mariadb-database-create" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.287731 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="571abf26-029e-4204-8f2b-63e1bb50bae4" containerName="mariadb-database-create" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.287950 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="571abf26-029e-4204-8f2b-63e1bb50bae4" containerName="mariadb-database-create" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.288676 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.293643 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.294164 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-ec7d-account-create-p9vkj"] Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.419238 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfwb2\" (UniqueName: \"kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2\") pod \"keystone-ec7d-account-create-p9vkj\" (UID: \"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6\") " pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.521042 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zfwb2\" (UniqueName: \"kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2\") pod \"keystone-ec7d-account-create-p9vkj\" (UID: \"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6\") " pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.553001 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfwb2\" (UniqueName: \"kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2\") pod \"keystone-ec7d-account-create-p9vkj\" (UID: \"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6\") " pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:11 crc kubenswrapper[4703]: I1011 04:08:11.616435 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:12 crc kubenswrapper[4703]: I1011 04:08:12.054294 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-ec7d-account-create-p9vkj"] Oct 11 04:08:12 crc kubenswrapper[4703]: W1011 04:08:12.061804 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d50c0fe_ad27_488e_93e1_e5bb9d6586a6.slice/crio-7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4 WatchSource:0}: Error finding container 7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4: Status 404 returned error can't find the container with id 7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4 Oct 11 04:08:12 crc kubenswrapper[4703]: I1011 04:08:12.460621 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 11 04:08:12 crc kubenswrapper[4703]: I1011 04:08:12.962172 4703 generic.go:334] "Generic (PLEG): container finished" podID="6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" 
containerID="71b5ade3a1c496fb558036f4a60a8d9b76747c5aa899df596f498a978c78ba64" exitCode=0 Oct 11 04:08:12 crc kubenswrapper[4703]: I1011 04:08:12.962223 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" event={"ID":"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6","Type":"ContainerDied","Data":"71b5ade3a1c496fb558036f4a60a8d9b76747c5aa899df596f498a978c78ba64"} Oct 11 04:08:12 crc kubenswrapper[4703]: I1011 04:08:12.962251 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" event={"ID":"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6","Type":"ContainerStarted","Data":"7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4"} Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.294211 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.461315 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfwb2\" (UniqueName: \"kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2\") pod \"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6\" (UID: \"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6\") " Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.470148 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2" (OuterVolumeSpecName: "kube-api-access-zfwb2") pod "6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" (UID: "6d50c0fe-ad27-488e-93e1-e5bb9d6586a6"). InnerVolumeSpecName "kube-api-access-zfwb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.563134 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfwb2\" (UniqueName: \"kubernetes.io/projected/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6-kube-api-access-zfwb2\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.983105 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" event={"ID":"6d50c0fe-ad27-488e-93e1-e5bb9d6586a6","Type":"ContainerDied","Data":"7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4"} Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.983612 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7495e10fcad665797b9d7ca29967b89eb3d79f6d9e89d2ab40b3598577668ad4" Oct 11 04:08:14 crc kubenswrapper[4703]: I1011 04:08:14.983161 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-ec7d-account-create-p9vkj" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.872174 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-psk46"] Oct 11 04:08:15 crc kubenswrapper[4703]: E1011 04:08:15.872564 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" containerName="mariadb-account-create" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.872584 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" containerName="mariadb-account-create" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.872785 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" containerName="mariadb-account-create" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.873734 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.881997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-psk46"] Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.884975 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-rsvfm" Oct 11 04:08:15 crc kubenswrapper[4703]: I1011 04:08:15.982766 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjq7\" (UniqueName: \"kubernetes.io/projected/e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90-kube-api-access-8sjq7\") pod \"horizon-operator-index-psk46\" (UID: \"e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90\") " pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.066282 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-vkl46"] Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.067233 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.069705 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-9zmkr" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.077208 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-vkl46"] Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.084122 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjq7\" (UniqueName: \"kubernetes.io/projected/e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90-kube-api-access-8sjq7\") pod \"horizon-operator-index-psk46\" (UID: \"e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90\") " pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.108516 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjq7\" (UniqueName: \"kubernetes.io/projected/e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90-kube-api-access-8sjq7\") pod \"horizon-operator-index-psk46\" (UID: \"e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90\") " pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.185313 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvqm\" (UniqueName: \"kubernetes.io/projected/da7dfdee-6374-4766-9600-13adbf52e3ed-kube-api-access-xcvqm\") pod \"swift-operator-index-vkl46\" (UID: \"da7dfdee-6374-4766-9600-13adbf52e3ed\") " pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.200170 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.287210 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvqm\" (UniqueName: \"kubernetes.io/projected/da7dfdee-6374-4766-9600-13adbf52e3ed-kube-api-access-xcvqm\") pod \"swift-operator-index-vkl46\" (UID: \"da7dfdee-6374-4766-9600-13adbf52e3ed\") " pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.314276 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvqm\" (UniqueName: \"kubernetes.io/projected/da7dfdee-6374-4766-9600-13adbf52e3ed-kube-api-access-xcvqm\") pod \"swift-operator-index-vkl46\" (UID: \"da7dfdee-6374-4766-9600-13adbf52e3ed\") " pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.385271 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.649563 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-bdtz7"] Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.650598 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.652620 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.652847 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-xwhhk" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.653372 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.654299 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.660057 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-bdtz7"] Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.685880 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-psk46"] Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.778221 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-vkl46"] Oct 11 04:08:16 crc kubenswrapper[4703]: W1011 04:08:16.784932 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda7dfdee_6374_4766_9600_13adbf52e3ed.slice/crio-cbd7af595a5ad99fb9cf4bcd3af89aadb4211a6e857954a7d95e0ec44cd06389 WatchSource:0}: Error finding container cbd7af595a5ad99fb9cf4bcd3af89aadb4211a6e857954a7d95e0ec44cd06389: Status 404 returned error can't find the container with id cbd7af595a5ad99fb9cf4bcd3af89aadb4211a6e857954a7d95e0ec44cd06389 Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.795157 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vwl52\" (UniqueName: \"kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.795260 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.897083 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwl52\" (UniqueName: \"kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.897146 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.903898 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.914581 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwl52\" (UniqueName: 
\"kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52\") pod \"keystone-db-sync-bdtz7\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.967778 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.996796 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-vkl46" event={"ID":"da7dfdee-6374-4766-9600-13adbf52e3ed","Type":"ContainerStarted","Data":"cbd7af595a5ad99fb9cf4bcd3af89aadb4211a6e857954a7d95e0ec44cd06389"} Oct 11 04:08:16 crc kubenswrapper[4703]: I1011 04:08:16.997661 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-psk46" event={"ID":"e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90","Type":"ContainerStarted","Data":"2fe06d2e9a0820cf71a5567863a5ef728eb6f5ced157a07316f69795deb3876a"} Oct 11 04:08:17 crc kubenswrapper[4703]: I1011 04:08:17.234704 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-bdtz7"] Oct 11 04:08:18 crc kubenswrapper[4703]: I1011 04:08:18.007885 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" event={"ID":"dd76fb71-0332-453e-b6ee-27d3bcef51c5","Type":"ContainerStarted","Data":"e055d414dd9bb5877665430162d2c1fdc5df4263f66e39eff67f879f80f82871"} Oct 11 04:08:19 crc kubenswrapper[4703]: I1011 04:08:19.015875 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-vkl46" event={"ID":"da7dfdee-6374-4766-9600-13adbf52e3ed","Type":"ContainerStarted","Data":"87d9de5b6a62682cd7dffbd7f968fb94344ae18bbecc6bfb0492ef45e8d8d61f"} Oct 11 04:08:19 crc kubenswrapper[4703]: I1011 04:08:19.018316 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-index-psk46" event={"ID":"e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90","Type":"ContainerStarted","Data":"43171f0c7cb8246190a279838d7a90beacca973600582f6f9ee4634308ac5fc5"} Oct 11 04:08:19 crc kubenswrapper[4703]: I1011 04:08:19.033248 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-vkl46" podStartSLOduration=1.198048458 podStartE2EDuration="3.033197796s" podCreationTimestamp="2025-10-11 04:08:16 +0000 UTC" firstStartedPulling="2025-10-11 04:08:16.787173883 +0000 UTC m=+787.997655815" lastFinishedPulling="2025-10-11 04:08:18.622323231 +0000 UTC m=+789.832805153" observedRunningTime="2025-10-11 04:08:19.031958114 +0000 UTC m=+790.242440056" watchObservedRunningTime="2025-10-11 04:08:19.033197796 +0000 UTC m=+790.243679728" Oct 11 04:08:19 crc kubenswrapper[4703]: I1011 04:08:19.050721 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-psk46" podStartSLOduration=2.169767096 podStartE2EDuration="4.050700707s" podCreationTimestamp="2025-10-11 04:08:15 +0000 UTC" firstStartedPulling="2025-10-11 04:08:16.69506914 +0000 UTC m=+787.905551062" lastFinishedPulling="2025-10-11 04:08:18.576002751 +0000 UTC m=+789.786484673" observedRunningTime="2025-10-11 04:08:19.048183511 +0000 UTC m=+790.258665423" watchObservedRunningTime="2025-10-11 04:08:19.050700707 +0000 UTC m=+790.261182629" Oct 11 04:08:20 crc kubenswrapper[4703]: I1011 04:08:20.254710 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:08:20 crc kubenswrapper[4703]: I1011 04:08:20.254939 4703 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:08:20 crc kubenswrapper[4703]: I1011 04:08:20.254977 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:08:20 crc kubenswrapper[4703]: I1011 04:08:20.255458 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:08:20 crc kubenswrapper[4703]: I1011 04:08:20.255524 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852" gracePeriod=600 Oct 11 04:08:21 crc kubenswrapper[4703]: I1011 04:08:21.036004 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852" exitCode=0 Oct 11 04:08:21 crc kubenswrapper[4703]: I1011 04:08:21.036053 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852"} Oct 11 04:08:21 crc kubenswrapper[4703]: I1011 04:08:21.036089 4703 scope.go:117] "RemoveContainer" 
containerID="0367c11cbaa435a21e96a60e9660edc64fff8a40838e7115f48e8069226f948f" Oct 11 04:08:25 crc kubenswrapper[4703]: I1011 04:08:25.066808 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439"} Oct 11 04:08:25 crc kubenswrapper[4703]: I1011 04:08:25.069159 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" event={"ID":"dd76fb71-0332-453e-b6ee-27d3bcef51c5","Type":"ContainerStarted","Data":"e5dfb31cdbff8a220a0d32ca7d9e05a2755a1728a8137e586eec40f56ee8979c"} Oct 11 04:08:25 crc kubenswrapper[4703]: I1011 04:08:25.094892 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" podStartSLOduration=2.1806094639999998 podStartE2EDuration="9.094868259s" podCreationTimestamp="2025-10-11 04:08:16 +0000 UTC" firstStartedPulling="2025-10-11 04:08:17.240537178 +0000 UTC m=+788.451019120" lastFinishedPulling="2025-10-11 04:08:24.154795993 +0000 UTC m=+795.365277915" observedRunningTime="2025-10-11 04:08:25.093844572 +0000 UTC m=+796.304326494" watchObservedRunningTime="2025-10-11 04:08:25.094868259 +0000 UTC m=+796.305350201" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 04:08:26.201928 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 04:08:26.202291 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 04:08:26.247227 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 
04:08:26.386118 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 04:08:26.386174 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:26 crc kubenswrapper[4703]: I1011 04:08:26.422433 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:27 crc kubenswrapper[4703]: I1011 04:08:27.106358 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-vkl46" Oct 11 04:08:27 crc kubenswrapper[4703]: I1011 04:08:27.107550 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-psk46" Oct 11 04:08:28 crc kubenswrapper[4703]: I1011 04:08:28.089315 4703 generic.go:334] "Generic (PLEG): container finished" podID="dd76fb71-0332-453e-b6ee-27d3bcef51c5" containerID="e5dfb31cdbff8a220a0d32ca7d9e05a2755a1728a8137e586eec40f56ee8979c" exitCode=0 Oct 11 04:08:28 crc kubenswrapper[4703]: I1011 04:08:28.089727 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" event={"ID":"dd76fb71-0332-453e-b6ee-27d3bcef51c5","Type":"ContainerDied","Data":"e5dfb31cdbff8a220a0d32ca7d9e05a2755a1728a8137e586eec40f56ee8979c"} Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.418002 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.480218 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwl52\" (UniqueName: \"kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52\") pod \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.480322 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data\") pod \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\" (UID: \"dd76fb71-0332-453e-b6ee-27d3bcef51c5\") " Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.485609 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52" (OuterVolumeSpecName: "kube-api-access-vwl52") pod "dd76fb71-0332-453e-b6ee-27d3bcef51c5" (UID: "dd76fb71-0332-453e-b6ee-27d3bcef51c5"). InnerVolumeSpecName "kube-api-access-vwl52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.507877 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data" (OuterVolumeSpecName: "config-data") pod "dd76fb71-0332-453e-b6ee-27d3bcef51c5" (UID: "dd76fb71-0332-453e-b6ee-27d3bcef51c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.582350 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwl52\" (UniqueName: \"kubernetes.io/projected/dd76fb71-0332-453e-b6ee-27d3bcef51c5-kube-api-access-vwl52\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:29 crc kubenswrapper[4703]: I1011 04:08:29.582387 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd76fb71-0332-453e-b6ee-27d3bcef51c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.107353 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" event={"ID":"dd76fb71-0332-453e-b6ee-27d3bcef51c5","Type":"ContainerDied","Data":"e055d414dd9bb5877665430162d2c1fdc5df4263f66e39eff67f879f80f82871"} Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.107665 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e055d414dd9bb5877665430162d2c1fdc5df4263f66e39eff67f879f80f82871" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.107432 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-bdtz7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.297396 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-cz7j7"] Oct 11 04:08:30 crc kubenswrapper[4703]: E1011 04:08:30.297724 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd76fb71-0332-453e-b6ee-27d3bcef51c5" containerName="keystone-db-sync" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.297747 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd76fb71-0332-453e-b6ee-27d3bcef51c5" containerName="keystone-db-sync" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.297898 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd76fb71-0332-453e-b6ee-27d3bcef51c5" containerName="keystone-db-sync" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.298385 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.301764 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-xwhhk" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.301794 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.305146 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.305194 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.316639 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-cz7j7"] Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.394131 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.394193 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.394216 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmb6\" (UniqueName: \"kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.394298 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.394331 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.495812 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.495885 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.495916 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmb6\" (UniqueName: \"kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.495985 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.496028 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.499854 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.500096 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.500301 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.500426 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.511438 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmb6\" (UniqueName: \"kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6\") pod \"keystone-bootstrap-cz7j7\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:30 crc kubenswrapper[4703]: I1011 04:08:30.613960 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:31 crc kubenswrapper[4703]: I1011 04:08:31.182003 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-cz7j7"] Oct 11 04:08:32 crc kubenswrapper[4703]: I1011 04:08:32.126379 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" event={"ID":"7b956b3d-6e86-4bd8-9862-f0a421e4ae20","Type":"ContainerStarted","Data":"426fd0f4fbf8ca8a7f47a8c399f356c75886ad2ad9b9600d1db3f651e0c728f4"} Oct 11 04:08:32 crc kubenswrapper[4703]: I1011 04:08:32.126692 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" event={"ID":"7b956b3d-6e86-4bd8-9862-f0a421e4ae20","Type":"ContainerStarted","Data":"eb8a09d3d67b78ab69fb065e885f92b7df92b4b50e33814567ae486bb7d983d8"} Oct 11 04:08:32 crc kubenswrapper[4703]: I1011 04:08:32.146083 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" podStartSLOduration=2.146065189 podStartE2EDuration="2.146065189s" podCreationTimestamp="2025-10-11 04:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:08:32.14273887 +0000 UTC m=+803.353220802" watchObservedRunningTime="2025-10-11 04:08:32.146065189 +0000 UTC m=+803.356547121" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.500994 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf"] Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.502338 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.504140 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.521370 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf"] Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.669686 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.670195 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbpj\" (UniqueName: \"kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.670246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 
04:08:33.772604 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.772729 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbpj\" (UniqueName: \"kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.772773 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.787738 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.787839 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.797720 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdbpj\" (UniqueName: \"kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj\") pod \"8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:33 crc kubenswrapper[4703]: I1011 04:08:33.870257 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.152233 4703 generic.go:334] "Generic (PLEG): container finished" podID="7b956b3d-6e86-4bd8-9862-f0a421e4ae20" containerID="426fd0f4fbf8ca8a7f47a8c399f356c75886ad2ad9b9600d1db3f651e0c728f4" exitCode=0 Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.152283 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" event={"ID":"7b956b3d-6e86-4bd8-9862-f0a421e4ae20","Type":"ContainerDied","Data":"426fd0f4fbf8ca8a7f47a8c399f356c75886ad2ad9b9600d1db3f651e0c728f4"} Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.277616 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf"] Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.510688 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw"] Oct 11 04:08:34 crc 
kubenswrapper[4703]: I1011 04:08:34.512789 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.535641 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw"] Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.684079 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6ltf\" (UniqueName: \"kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.684214 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.684408 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.785369 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.785446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6ltf\" (UniqueName: \"kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.785534 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.785941 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.785965 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: 
\"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.808743 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6ltf\" (UniqueName: \"kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf\") pod \"f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:34 crc kubenswrapper[4703]: I1011 04:08:34.835536 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.161726 4703 generic.go:334] "Generic (PLEG): container finished" podID="6e907691-72dc-445b-a57f-e70d51d3877c" containerID="06cba7c264fdc1961615c38843d667f8b0d61a06e21e9aed1efccd72b2d329f7" exitCode=0 Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.161772 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" event={"ID":"6e907691-72dc-445b-a57f-e70d51d3877c","Type":"ContainerDied","Data":"06cba7c264fdc1961615c38843d667f8b0d61a06e21e9aed1efccd72b2d329f7"} Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.162054 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" event={"ID":"6e907691-72dc-445b-a57f-e70d51d3877c","Type":"ContainerStarted","Data":"5a225bbd41abc7c3e525a2467195ce2297a52c7799c71f31721dcd7e49245e1c"} Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.267381 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw"] Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.413603 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.496574 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data\") pod \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.496649 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmb6\" (UniqueName: \"kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6\") pod \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.496682 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys\") pod \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.496742 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts\") pod \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\" (UID: \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.496809 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys\") pod \"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\" (UID: 
\"7b956b3d-6e86-4bd8-9862-f0a421e4ae20\") " Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.501118 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7b956b3d-6e86-4bd8-9862-f0a421e4ae20" (UID: "7b956b3d-6e86-4bd8-9862-f0a421e4ae20"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.501224 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts" (OuterVolumeSpecName: "scripts") pod "7b956b3d-6e86-4bd8-9862-f0a421e4ae20" (UID: "7b956b3d-6e86-4bd8-9862-f0a421e4ae20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.501287 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7b956b3d-6e86-4bd8-9862-f0a421e4ae20" (UID: "7b956b3d-6e86-4bd8-9862-f0a421e4ae20"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.501322 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6" (OuterVolumeSpecName: "kube-api-access-snmb6") pod "7b956b3d-6e86-4bd8-9862-f0a421e4ae20" (UID: "7b956b3d-6e86-4bd8-9862-f0a421e4ae20"). InnerVolumeSpecName "kube-api-access-snmb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.512196 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data" (OuterVolumeSpecName: "config-data") pod "7b956b3d-6e86-4bd8-9862-f0a421e4ae20" (UID: "7b956b3d-6e86-4bd8-9862-f0a421e4ae20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.598454 4703 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.598515 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.598524 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmb6\" (UniqueName: \"kubernetes.io/projected/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-kube-api-access-snmb6\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.598534 4703 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:35 crc kubenswrapper[4703]: I1011 04:08:35.598545 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b956b3d-6e86-4bd8-9862-f0a421e4ae20-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.169814 4703 generic.go:334] "Generic (PLEG): container finished" podID="6d96e09e-5277-45d1-83d1-2230caf6a714" 
containerID="1e8bf193a6b9bea3c12540cf7f5571733b5255d471432baf30c1e76e678f5d8c" exitCode=0 Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.169891 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" event={"ID":"6d96e09e-5277-45d1-83d1-2230caf6a714","Type":"ContainerDied","Data":"1e8bf193a6b9bea3c12540cf7f5571733b5255d471432baf30c1e76e678f5d8c"} Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.169919 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" event={"ID":"6d96e09e-5277-45d1-83d1-2230caf6a714","Type":"ContainerStarted","Data":"eeccda4e0bc2adf61159f826a16b89e4ac32671c2292b80375fe6bed0b1b40f3"} Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.171911 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" event={"ID":"7b956b3d-6e86-4bd8-9862-f0a421e4ae20","Type":"ContainerDied","Data":"eb8a09d3d67b78ab69fb065e885f92b7df92b4b50e33814567ae486bb7d983d8"} Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.171943 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8a09d3d67b78ab69fb065e885f92b7df92b4b50e33814567ae486bb7d983d8" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.171978 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-cz7j7" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.283794 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-76f545d5bb-kbdgh"] Oct 11 04:08:36 crc kubenswrapper[4703]: E1011 04:08:36.284038 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b956b3d-6e86-4bd8-9862-f0a421e4ae20" containerName="keystone-bootstrap" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.284050 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b956b3d-6e86-4bd8-9862-f0a421e4ae20" containerName="keystone-bootstrap" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.284152 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b956b3d-6e86-4bd8-9862-f0a421e4ae20" containerName="keystone-bootstrap" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.284547 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.288804 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.288863 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.288881 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-xwhhk" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.289048 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.296420 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-76f545d5bb-kbdgh"] Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.410112 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-fernet-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.410223 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-credential-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.410256 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46wk\" (UniqueName: \"kubernetes.io/projected/8e0390a0-7f33-42a4-9657-3796ac67f9a5-kube-api-access-t46wk\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.410284 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-scripts\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.410306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-config-data\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 
04:08:36.512099 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-fernet-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.512232 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-credential-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.512279 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46wk\" (UniqueName: \"kubernetes.io/projected/8e0390a0-7f33-42a4-9657-3796ac67f9a5-kube-api-access-t46wk\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.512325 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-scripts\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.512376 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-config-data\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.518349 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-fernet-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.518409 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-credential-keys\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.519100 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-scripts\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.520851 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e0390a0-7f33-42a4-9657-3796ac67f9a5-config-data\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.535248 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46wk\" (UniqueName: \"kubernetes.io/projected/8e0390a0-7f33-42a4-9657-3796ac67f9a5-kube-api-access-t46wk\") pod \"keystone-76f545d5bb-kbdgh\" (UID: \"8e0390a0-7f33-42a4-9657-3796ac67f9a5\") " pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:36 crc kubenswrapper[4703]: I1011 04:08:36.601107 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:37 crc kubenswrapper[4703]: I1011 04:08:37.058606 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-76f545d5bb-kbdgh"] Oct 11 04:08:37 crc kubenswrapper[4703]: W1011 04:08:37.069025 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0390a0_7f33_42a4_9657_3796ac67f9a5.slice/crio-33674ca6275e7ea6d1cf8dbd217a11d53a30379298ac8c051c1f62063bf8be77 WatchSource:0}: Error finding container 33674ca6275e7ea6d1cf8dbd217a11d53a30379298ac8c051c1f62063bf8be77: Status 404 returned error can't find the container with id 33674ca6275e7ea6d1cf8dbd217a11d53a30379298ac8c051c1f62063bf8be77 Oct 11 04:08:37 crc kubenswrapper[4703]: I1011 04:08:37.180089 4703 generic.go:334] "Generic (PLEG): container finished" podID="6e907691-72dc-445b-a57f-e70d51d3877c" containerID="1e7ae5e094d99eddf43470613d34deb8ad5177dd6b793d31455e464ef5d54df2" exitCode=0 Oct 11 04:08:37 crc kubenswrapper[4703]: I1011 04:08:37.180322 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" event={"ID":"6e907691-72dc-445b-a57f-e70d51d3877c","Type":"ContainerDied","Data":"1e7ae5e094d99eddf43470613d34deb8ad5177dd6b793d31455e464ef5d54df2"} Oct 11 04:08:37 crc kubenswrapper[4703]: I1011 04:08:37.181590 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" event={"ID":"8e0390a0-7f33-42a4-9657-3796ac67f9a5","Type":"ContainerStarted","Data":"33674ca6275e7ea6d1cf8dbd217a11d53a30379298ac8c051c1f62063bf8be77"} Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.195653 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" 
event={"ID":"8e0390a0-7f33-42a4-9657-3796ac67f9a5","Type":"ContainerStarted","Data":"ee82e2896fac815f896fd9696db2604b8f7a8d626d2f1ebc1981a80497924307"} Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.196259 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.202117 4703 generic.go:334] "Generic (PLEG): container finished" podID="6e907691-72dc-445b-a57f-e70d51d3877c" containerID="4b8f044b08de1e160bd476a65ab9958ee84e54d46e53ebb917a656966dd2a618" exitCode=0 Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.202190 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" event={"ID":"6e907691-72dc-445b-a57f-e70d51d3877c","Type":"ContainerDied","Data":"4b8f044b08de1e160bd476a65ab9958ee84e54d46e53ebb917a656966dd2a618"} Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.205246 4703 generic.go:334] "Generic (PLEG): container finished" podID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerID="a355b783e517e4a3df08de1c702b92fc274152abc7bcb7908b1eac8027a028bc" exitCode=0 Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.205310 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" event={"ID":"6d96e09e-5277-45d1-83d1-2230caf6a714","Type":"ContainerDied","Data":"a355b783e517e4a3df08de1c702b92fc274152abc7bcb7908b1eac8027a028bc"} Oct 11 04:08:38 crc kubenswrapper[4703]: I1011 04:08:38.222604 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" podStartSLOduration=2.222583352 podStartE2EDuration="2.222583352s" podCreationTimestamp="2025-10-11 04:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-11 04:08:38.21608816 +0000 UTC m=+809.426570162" watchObservedRunningTime="2025-10-11 04:08:38.222583352 +0000 UTC m=+809.433065304" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.220211 4703 generic.go:334] "Generic (PLEG): container finished" podID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerID="b00801c70bcaa4131ecc5334bb67ca45fe25233db29db88b323e7a4c26a1bb59" exitCode=0 Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.220289 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" event={"ID":"6d96e09e-5277-45d1-83d1-2230caf6a714","Type":"ContainerDied","Data":"b00801c70bcaa4131ecc5334bb67ca45fe25233db29db88b323e7a4c26a1bb59"} Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.639910 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.699121 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle\") pod \"6e907691-72dc-445b-a57f-e70d51d3877c\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.699208 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util\") pod \"6e907691-72dc-445b-a57f-e70d51d3877c\" (UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.699343 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdbpj\" (UniqueName: \"kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj\") pod \"6e907691-72dc-445b-a57f-e70d51d3877c\" 
(UID: \"6e907691-72dc-445b-a57f-e70d51d3877c\") " Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.699927 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle" (OuterVolumeSpecName: "bundle") pod "6e907691-72dc-445b-a57f-e70d51d3877c" (UID: "6e907691-72dc-445b-a57f-e70d51d3877c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.703951 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj" (OuterVolumeSpecName: "kube-api-access-gdbpj") pod "6e907691-72dc-445b-a57f-e70d51d3877c" (UID: "6e907691-72dc-445b-a57f-e70d51d3877c"). InnerVolumeSpecName "kube-api-access-gdbpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.717922 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util" (OuterVolumeSpecName: "util") pod "6e907691-72dc-445b-a57f-e70d51d3877c" (UID: "6e907691-72dc-445b-a57f-e70d51d3877c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.800808 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.800855 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdbpj\" (UniqueName: \"kubernetes.io/projected/6e907691-72dc-445b-a57f-e70d51d3877c-kube-api-access-gdbpj\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:39 crc kubenswrapper[4703]: I1011 04:08:39.800873 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e907691-72dc-445b-a57f-e70d51d3877c-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.249103 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" event={"ID":"6e907691-72dc-445b-a57f-e70d51d3877c","Type":"ContainerDied","Data":"5a225bbd41abc7c3e525a2467195ce2297a52c7799c71f31721dcd7e49245e1c"} Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.249203 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a225bbd41abc7c3e525a2467195ce2297a52c7799c71f31721dcd7e49245e1c" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.250539 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.587280 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.609207 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle\") pod \"6d96e09e-5277-45d1-83d1-2230caf6a714\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.609315 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6ltf\" (UniqueName: \"kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf\") pod \"6d96e09e-5277-45d1-83d1-2230caf6a714\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.609367 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util\") pod \"6d96e09e-5277-45d1-83d1-2230caf6a714\" (UID: \"6d96e09e-5277-45d1-83d1-2230caf6a714\") " Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.610589 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle" (OuterVolumeSpecName: "bundle") pod "6d96e09e-5277-45d1-83d1-2230caf6a714" (UID: "6d96e09e-5277-45d1-83d1-2230caf6a714"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.616049 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf" (OuterVolumeSpecName: "kube-api-access-n6ltf") pod "6d96e09e-5277-45d1-83d1-2230caf6a714" (UID: "6d96e09e-5277-45d1-83d1-2230caf6a714"). InnerVolumeSpecName "kube-api-access-n6ltf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.630695 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util" (OuterVolumeSpecName: "util") pod "6d96e09e-5277-45d1-83d1-2230caf6a714" (UID: "6d96e09e-5277-45d1-83d1-2230caf6a714"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.710581 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.710611 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6ltf\" (UniqueName: \"kubernetes.io/projected/6d96e09e-5277-45d1-83d1-2230caf6a714-kube-api-access-n6ltf\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:40 crc kubenswrapper[4703]: I1011 04:08:40.710621 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d96e09e-5277-45d1-83d1-2230caf6a714-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:08:41 crc kubenswrapper[4703]: I1011 04:08:41.260004 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" event={"ID":"6d96e09e-5277-45d1-83d1-2230caf6a714","Type":"ContainerDied","Data":"eeccda4e0bc2adf61159f826a16b89e4ac32671c2292b80375fe6bed0b1b40f3"} Oct 11 04:08:41 crc kubenswrapper[4703]: I1011 04:08:41.260039 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw" Oct 11 04:08:41 crc kubenswrapper[4703]: I1011 04:08:41.260060 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeccda4e0bc2adf61159f826a16b89e4ac32671c2292b80375fe6bed0b1b40f3" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.790706 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq"] Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791295 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="pull" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791307 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="pull" Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791315 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791321 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791332 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="pull" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791339 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="pull" Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791347 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="util" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791353 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="util" Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791362 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791367 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: E1011 04:08:53.791377 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="util" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791382 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="util" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791513 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d96e09e-5277-45d1-83d1-2230caf6a714" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.791524 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e907691-72dc-445b-a57f-e70d51d3877c" containerName="extract" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.792065 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.793744 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n9sm8" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.793882 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.812446 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq"] Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.943767 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-webhook-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.943924 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-apiservice-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:53 crc kubenswrapper[4703]: I1011 04:08:53.943983 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgxqr\" (UniqueName: \"kubernetes.io/projected/6254e401-fe69-44a8-a16a-4423e12136bf-kube-api-access-bgxqr\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: 
\"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.045151 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-webhook-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.045408 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-apiservice-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.045623 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgxqr\" (UniqueName: \"kubernetes.io/projected/6254e401-fe69-44a8-a16a-4423e12136bf-kube-api-access-bgxqr\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.051573 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-webhook-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.055130 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6254e401-fe69-44a8-a16a-4423e12136bf-apiservice-cert\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.063166 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgxqr\" (UniqueName: \"kubernetes.io/projected/6254e401-fe69-44a8-a16a-4423e12136bf-kube-api-access-bgxqr\") pod \"swift-operator-controller-manager-5985b768f7-2w5hq\" (UID: \"6254e401-fe69-44a8-a16a-4423e12136bf\") " pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.129563 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.599579 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq"] Oct 11 04:08:54 crc kubenswrapper[4703]: I1011 04:08:54.609049 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 04:08:55 crc kubenswrapper[4703]: I1011 04:08:55.356253 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" event={"ID":"6254e401-fe69-44a8-a16a-4423e12136bf","Type":"ContainerStarted","Data":"12cf5cd05801583a20cb4393083da2545f99d37167465ad570daaef6020bd8ca"} Oct 11 04:08:57 crc kubenswrapper[4703]: I1011 04:08:57.369563 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" 
event={"ID":"6254e401-fe69-44a8-a16a-4423e12136bf","Type":"ContainerStarted","Data":"0cc8f054f6fe57edc08cb3fa044eab638addead28dec3fb06dd6bab225ea8b4b"} Oct 11 04:08:57 crc kubenswrapper[4703]: I1011 04:08:57.369867 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:08:57 crc kubenswrapper[4703]: I1011 04:08:57.369885 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" event={"ID":"6254e401-fe69-44a8-a16a-4423e12136bf","Type":"ContainerStarted","Data":"b6d9ec9503d6c188809a2c2dc7ba564fd7cd9c1882385c09d0c6becc242829e8"} Oct 11 04:08:57 crc kubenswrapper[4703]: I1011 04:08:57.400937 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" podStartSLOduration=2.642980221 podStartE2EDuration="4.400918976s" podCreationTimestamp="2025-10-11 04:08:53 +0000 UTC" firstStartedPulling="2025-10-11 04:08:54.60883996 +0000 UTC m=+825.819321882" lastFinishedPulling="2025-10-11 04:08:56.366778715 +0000 UTC m=+827.577260637" observedRunningTime="2025-10-11 04:08:57.397895196 +0000 UTC m=+828.608377138" watchObservedRunningTime="2025-10-11 04:08:57.400918976 +0000 UTC m=+828.611400898" Oct 11 04:09:04 crc kubenswrapper[4703]: I1011 04:09:04.134453 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5985b768f7-2w5hq" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.807379 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44"] Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.808986 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.811925 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.811953 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xwq24" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.860682 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44"] Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.929147 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbtr\" (UniqueName: \"kubernetes.io/projected/9a73b45a-1271-42e4-9600-afee376337be-kube-api-access-4mbtr\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.929313 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-apiservice-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:06 crc kubenswrapper[4703]: I1011 04:09:06.929342 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-webhook-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: 
\"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.030509 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-apiservice-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.030552 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-webhook-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.030582 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbtr\" (UniqueName: \"kubernetes.io/projected/9a73b45a-1271-42e4-9600-afee376337be-kube-api-access-4mbtr\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.036395 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-webhook-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.047089 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a73b45a-1271-42e4-9600-afee376337be-apiservice-cert\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.057124 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbtr\" (UniqueName: \"kubernetes.io/projected/9a73b45a-1271-42e4-9600-afee376337be-kube-api-access-4mbtr\") pod \"horizon-operator-controller-manager-5845cf79b9-bcr44\" (UID: \"9a73b45a-1271-42e4-9600-afee376337be\") " pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.127651 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:07 crc kubenswrapper[4703]: I1011 04:09:07.499689 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44"] Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.130566 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-76f545d5bb-kbdgh" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.440311 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" event={"ID":"9a73b45a-1271-42e4-9600-afee376337be","Type":"ContainerStarted","Data":"761822b2850a59ff0919df743dc66f7972837cffaab5aab0dee6f83628119db5"} Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.819125 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.824213 4703 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.828235 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.828842 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-hflrd" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.829292 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.829712 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.835770 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.957799 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-cache\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.957877 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-lock\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.957897 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: 
\"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.957936 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6285\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-kube-api-access-r6285\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:08 crc kubenswrapper[4703]: I1011 04:09:08.957980 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.059500 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-lock\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.059830 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.059954 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6285\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-kube-api-access-r6285\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: 
E1011 04:09:09.060045 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:09 crc kubenswrapper[4703]: E1011 04:09:09.060083 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 11 04:09:09 crc kubenswrapper[4703]: E1011 04:09:09.060152 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift podName:edaa5850-8c67-4739-8aa9-b02eff7e3291 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:09.560129133 +0000 UTC m=+840.770611055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift") pod "swift-storage-0" (UID: "edaa5850-8c67-4739-8aa9-b02eff7e3291") : configmap "swift-ring-files" not found Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.060293 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.060512 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-cache\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.060783 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") 
device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.060852 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-lock\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.060923 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/edaa5850-8c67-4739-8aa9-b02eff7e3291-cache\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.079352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6285\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-kube-api-access-r6285\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.083140 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: I1011 04:09:09.566209 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:09 crc kubenswrapper[4703]: E1011 04:09:09.566391 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: 
configmap "swift-ring-files" not found Oct 11 04:09:09 crc kubenswrapper[4703]: E1011 04:09:09.566674 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 11 04:09:09 crc kubenswrapper[4703]: E1011 04:09:09.566737 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift podName:edaa5850-8c67-4739-8aa9-b02eff7e3291 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:10.566715188 +0000 UTC m=+841.777197110 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift") pod "swift-storage-0" (UID: "edaa5850-8c67-4739-8aa9-b02eff7e3291") : configmap "swift-ring-files" not found Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.267352 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-7gwgv"] Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.268270 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.269973 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-6rtvj" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.274061 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-7gwgv"] Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.417184 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmfw\" (UniqueName: \"kubernetes.io/projected/30afd833-f0d2-4249-9a5b-2c318cef5220-kube-api-access-5pmfw\") pod \"glance-operator-index-7gwgv\" (UID: \"30afd833-f0d2-4249-9a5b-2c318cef5220\") " pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.455816 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" event={"ID":"9a73b45a-1271-42e4-9600-afee376337be","Type":"ContainerStarted","Data":"2053b0faf9aa7b8b23b369ab631b1676b5eadc4477e2281ce44565d83aee3e66"} Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.455862 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" event={"ID":"9a73b45a-1271-42e4-9600-afee376337be","Type":"ContainerStarted","Data":"f7adb57370690b63194f482fa35a9bd6351842de0c151ec1037450324eac8eb5"} Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.456763 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.518699 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmfw\" (UniqueName: 
\"kubernetes.io/projected/30afd833-f0d2-4249-9a5b-2c318cef5220-kube-api-access-5pmfw\") pod \"glance-operator-index-7gwgv\" (UID: \"30afd833-f0d2-4249-9a5b-2c318cef5220\") " pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.538945 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmfw\" (UniqueName: \"kubernetes.io/projected/30afd833-f0d2-4249-9a5b-2c318cef5220-kube-api-access-5pmfw\") pod \"glance-operator-index-7gwgv\" (UID: \"30afd833-f0d2-4249-9a5b-2c318cef5220\") " pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.619842 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:10 crc kubenswrapper[4703]: E1011 04:09:10.620054 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:10 crc kubenswrapper[4703]: E1011 04:09:10.620073 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 11 04:09:10 crc kubenswrapper[4703]: E1011 04:09:10.620127 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift podName:edaa5850-8c67-4739-8aa9-b02eff7e3291 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:12.620108106 +0000 UTC m=+843.830590038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift") pod "swift-storage-0" (UID: "edaa5850-8c67-4739-8aa9-b02eff7e3291") : configmap "swift-ring-files" not found Oct 11 04:09:10 crc kubenswrapper[4703]: I1011 04:09:10.622208 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:11 crc kubenswrapper[4703]: I1011 04:09:11.132608 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" podStartSLOduration=3.203794564 podStartE2EDuration="5.132585916s" podCreationTimestamp="2025-10-11 04:09:06 +0000 UTC" firstStartedPulling="2025-10-11 04:09:07.521030769 +0000 UTC m=+838.731512691" lastFinishedPulling="2025-10-11 04:09:09.449822121 +0000 UTC m=+840.660304043" observedRunningTime="2025-10-11 04:09:10.480393639 +0000 UTC m=+841.690875561" watchObservedRunningTime="2025-10-11 04:09:11.132585916 +0000 UTC m=+842.343067848" Oct 11 04:09:11 crc kubenswrapper[4703]: I1011 04:09:11.136044 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-7gwgv"] Oct 11 04:09:11 crc kubenswrapper[4703]: W1011 04:09:11.153711 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30afd833_f0d2_4249_9a5b_2c318cef5220.slice/crio-656db4c5ef344028d643ff74fb0dc80c699944347781f9da45e88be9eed6cdda WatchSource:0}: Error finding container 656db4c5ef344028d643ff74fb0dc80c699944347781f9da45e88be9eed6cdda: Status 404 returned error can't find the container with id 656db4c5ef344028d643ff74fb0dc80c699944347781f9da45e88be9eed6cdda Oct 11 04:09:11 crc kubenswrapper[4703]: I1011 04:09:11.466412 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7gwgv" 
event={"ID":"30afd833-f0d2-4249-9a5b-2c318cef5220","Type":"ContainerStarted","Data":"656db4c5ef344028d643ff74fb0dc80c699944347781f9da45e88be9eed6cdda"} Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.661263 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:12 crc kubenswrapper[4703]: E1011 04:09:12.661744 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:12 crc kubenswrapper[4703]: E1011 04:09:12.661758 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 11 04:09:12 crc kubenswrapper[4703]: E1011 04:09:12.661803 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift podName:edaa5850-8c67-4739-8aa9-b02eff7e3291 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:16.66178799 +0000 UTC m=+847.872269912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift") pod "swift-storage-0" (UID: "edaa5850-8c67-4739-8aa9-b02eff7e3291") : configmap "swift-ring-files" not found Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.780355 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-wfmd2"] Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.781766 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.785219 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.785324 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.789702 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.809868 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-wfmd2"] Oct 11 04:09:12 crc kubenswrapper[4703]: E1011 04:09:12.810411 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-t2qww ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-t2qww ring-data-devices scripts swiftconf]: context canceled" pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" podUID="85b146a5-76d4-4bce-87bc-483859cf7459" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.842020 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-wfmd2"] Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.848028 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-q8xj5"] Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.849012 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864615 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864663 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864704 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864732 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864754 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll794\" (UniqueName: 
\"kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864777 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864935 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.864991 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.865139 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.865290 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.865340 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qww\" (UniqueName: \"kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.865361 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.875976 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-q8xj5"] Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966535 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll794\" (UniqueName: \"kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966590 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") 
" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966620 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966675 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966705 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966744 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966790 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc 
kubenswrapper[4703]: I1011 04:09:12.966824 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qww\" (UniqueName: \"kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966852 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966878 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966902 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.966938 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 
04:09:12.967090 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.967236 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.967547 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.967570 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.967828 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.968060 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.981602 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.981644 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.981602 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.984446 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.985059 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll794\" (UniqueName: 
\"kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794\") pod \"swift-ring-rebalance-q8xj5\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:12 crc kubenswrapper[4703]: I1011 04:09:12.987652 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qww\" (UniqueName: \"kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww\") pod \"swift-ring-rebalance-wfmd2\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.174735 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-hflrd" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.183042 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.478907 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.478914 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7gwgv" event={"ID":"30afd833-f0d2-4249-9a5b-2c318cef5220","Type":"ContainerStarted","Data":"34aa6a348e9b655450bd4f47681efff161562d2031f2c8e6433ca6df5a49145f"} Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.487307 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577310 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577379 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2qww\" (UniqueName: \"kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577503 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577530 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577546 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577572 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf\") pod \"85b146a5-76d4-4bce-87bc-483859cf7459\" (UID: \"85b146a5-76d4-4bce-87bc-483859cf7459\") " Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.577906 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.578066 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts" (OuterVolumeSpecName: "scripts") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.578136 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.578702 4703 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85b146a5-76d4-4bce-87bc-483859cf7459-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.578722 4703 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.578731 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85b146a5-76d4-4bce-87bc-483859cf7459-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.583305 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww" (OuterVolumeSpecName: "kube-api-access-t2qww") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "kube-api-access-t2qww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.583308 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.591575 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "85b146a5-76d4-4bce-87bc-483859cf7459" (UID: "85b146a5-76d4-4bce-87bc-483859cf7459"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.627032 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-7gwgv" podStartSLOduration=1.834778231 podStartE2EDuration="3.627014408s" podCreationTimestamp="2025-10-11 04:09:10 +0000 UTC" firstStartedPulling="2025-10-11 04:09:11.155535861 +0000 UTC m=+842.366017783" lastFinishedPulling="2025-10-11 04:09:12.947772038 +0000 UTC m=+844.158253960" observedRunningTime="2025-10-11 04:09:13.504716019 +0000 UTC m=+844.715197941" watchObservedRunningTime="2025-10-11 04:09:13.627014408 +0000 UTC m=+844.837496330" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.633558 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-q8xj5"] Oct 11 04:09:13 crc kubenswrapper[4703]: W1011 04:09:13.638220 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76b1c0ac_2d3b_4d7b_ab5e_61044e6c92bd.slice/crio-41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158 WatchSource:0}: Error finding container 41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158: Status 404 returned error can't find the container with id 41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158 Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.680814 4703 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.680854 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2qww\" (UniqueName: \"kubernetes.io/projected/85b146a5-76d4-4bce-87bc-483859cf7459-kube-api-access-t2qww\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:13 crc kubenswrapper[4703]: I1011 04:09:13.680870 4703 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85b146a5-76d4-4bce-87bc-483859cf7459-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:14 crc kubenswrapper[4703]: I1011 04:09:14.490584 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" event={"ID":"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd","Type":"ContainerStarted","Data":"41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158"} Oct 11 04:09:14 crc kubenswrapper[4703]: I1011 04:09:14.490652 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-wfmd2" Oct 11 04:09:14 crc kubenswrapper[4703]: I1011 04:09:14.532793 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-wfmd2"] Oct 11 04:09:14 crc kubenswrapper[4703]: I1011 04:09:14.538865 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-wfmd2"] Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.541793 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b146a5-76d4-4bce-87bc-483859cf7459" path="/var/lib/kubelet/pods/85b146a5-76d4-4bce-87bc-483859cf7459/volumes" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.542147 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q"] Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.543342 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.554157 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q"] Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.608212 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-config-data\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.608289 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-log-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " 
pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.608328 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.608344 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkzc\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-kube-api-access-hlkzc\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.608370 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-run-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.709318 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-config-data\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.709393 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-log-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: 
\"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.709432 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.709447 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkzc\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-kube-api-access-hlkzc\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.709491 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-run-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: E1011 04:09:15.710238 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:15 crc kubenswrapper[4703]: E1011 04:09:15.710261 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q: configmap "swift-ring-files" not found Oct 11 04:09:15 crc kubenswrapper[4703]: E1011 04:09:15.710311 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift podName:e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6 nodeName:}" failed. 
No retries permitted until 2025-10-11 04:09:16.210292386 +0000 UTC m=+847.420774308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift") pod "swift-proxy-6dd8f59749-c7s8q" (UID: "e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6") : configmap "swift-ring-files" not found Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.710350 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-run-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.710578 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-log-httpd\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.714684 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-config-data\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:15 crc kubenswrapper[4703]: I1011 04:09:15.729564 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlkzc\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-kube-api-access-hlkzc\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:16 crc kubenswrapper[4703]: I1011 04:09:16.216036 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.216223 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.216247 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q: configmap "swift-ring-files" not found Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.216304 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift podName:e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:17.216284785 +0000 UTC m=+848.426766707 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift") pod "swift-proxy-6dd8f59749-c7s8q" (UID: "e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6") : configmap "swift-ring-files" not found Oct 11 04:09:16 crc kubenswrapper[4703]: I1011 04:09:16.723043 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.723227 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.723256 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 11 04:09:16 crc kubenswrapper[4703]: E1011 04:09:16.723315 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift podName:edaa5850-8c67-4739-8aa9-b02eff7e3291 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:24.723290351 +0000 UTC m=+855.933772273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift") pod "swift-storage-0" (UID: "edaa5850-8c67-4739-8aa9-b02eff7e3291") : configmap "swift-ring-files" not found Oct 11 04:09:17 crc kubenswrapper[4703]: I1011 04:09:17.131617 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5845cf79b9-bcr44" Oct 11 04:09:17 crc kubenswrapper[4703]: I1011 04:09:17.230607 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:17 crc kubenswrapper[4703]: E1011 04:09:17.230776 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:17 crc kubenswrapper[4703]: E1011 04:09:17.230789 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q: configmap "swift-ring-files" not found Oct 11 04:09:17 crc kubenswrapper[4703]: E1011 04:09:17.230843 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift podName:e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:19.230825292 +0000 UTC m=+850.441307214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift") pod "swift-proxy-6dd8f59749-c7s8q" (UID: "e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6") : configmap "swift-ring-files" not found Oct 11 04:09:17 crc kubenswrapper[4703]: I1011 04:09:17.509590 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" event={"ID":"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd","Type":"ContainerStarted","Data":"6bac3b5de6042f1a1dbbcc1591f2541d6d0ebe1f476fd3478c6c3c554a751a8d"} Oct 11 04:09:19 crc kubenswrapper[4703]: I1011 04:09:19.262405 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:19 crc kubenswrapper[4703]: E1011 04:09:19.262837 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:19 crc kubenswrapper[4703]: E1011 04:09:19.262890 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q: configmap "swift-ring-files" not found Oct 11 04:09:19 crc kubenswrapper[4703]: E1011 04:09:19.262991 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift podName:e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:23.262961474 +0000 UTC m=+854.473443436 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift") pod "swift-proxy-6dd8f59749-c7s8q" (UID: "e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6") : configmap "swift-ring-files" not found Oct 11 04:09:20 crc kubenswrapper[4703]: I1011 04:09:20.623588 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:20 crc kubenswrapper[4703]: I1011 04:09:20.623916 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:20 crc kubenswrapper[4703]: I1011 04:09:20.652449 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:20 crc kubenswrapper[4703]: I1011 04:09:20.673798 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" podStartSLOduration=5.406247719 podStartE2EDuration="8.673779551s" podCreationTimestamp="2025-10-11 04:09:12 +0000 UTC" firstStartedPulling="2025-10-11 04:09:13.639781034 +0000 UTC m=+844.850262956" lastFinishedPulling="2025-10-11 04:09:16.907312866 +0000 UTC m=+848.117794788" observedRunningTime="2025-10-11 04:09:17.534440614 +0000 UTC m=+848.744922546" watchObservedRunningTime="2025-10-11 04:09:20.673779551 +0000 UTC m=+851.884261473" Oct 11 04:09:21 crc kubenswrapper[4703]: I1011 04:09:21.571057 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-7gwgv" Oct 11 04:09:23 crc kubenswrapper[4703]: I1011 04:09:23.322586 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " 
pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:23 crc kubenswrapper[4703]: E1011 04:09:23.322778 4703 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 11 04:09:23 crc kubenswrapper[4703]: E1011 04:09:23.323019 4703 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q: configmap "swift-ring-files" not found Oct 11 04:09:23 crc kubenswrapper[4703]: E1011 04:09:23.323076 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift podName:e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6 nodeName:}" failed. No retries permitted until 2025-10-11 04:09:31.323056618 +0000 UTC m=+862.533538550 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift") pod "swift-proxy-6dd8f59749-c7s8q" (UID: "e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6") : configmap "swift-ring-files" not found Oct 11 04:09:24 crc kubenswrapper[4703]: I1011 04:09:24.555371 4703 generic.go:334] "Generic (PLEG): container finished" podID="76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" containerID="6bac3b5de6042f1a1dbbcc1591f2541d6d0ebe1f476fd3478c6c3c554a751a8d" exitCode=0 Oct 11 04:09:24 crc kubenswrapper[4703]: I1011 04:09:24.555498 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" event={"ID":"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd","Type":"ContainerDied","Data":"6bac3b5de6042f1a1dbbcc1591f2541d6d0ebe1f476fd3478c6c3c554a751a8d"} Oct 11 04:09:24 crc kubenswrapper[4703]: I1011 04:09:24.742533 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: 
\"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:24 crc kubenswrapper[4703]: I1011 04:09:24.755073 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/edaa5850-8c67-4739-8aa9-b02eff7e3291-etc-swift\") pod \"swift-storage-0\" (UID: \"edaa5850-8c67-4739-8aa9-b02eff7e3291\") " pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.044038 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Oct 11 04:09:25 crc kubenswrapper[4703]: W1011 04:09:25.543121 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedaa5850_8c67_4739_8aa9_b02eff7e3291.slice/crio-97b6e83c17f69d7656075a5bbbdc74a84a37c9b113e490b694fcf93158a36296 WatchSource:0}: Error finding container 97b6e83c17f69d7656075a5bbbdc74a84a37c9b113e490b694fcf93158a36296: Status 404 returned error can't find the container with id 97b6e83c17f69d7656075a5bbbdc74a84a37c9b113e490b694fcf93158a36296 Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.554560 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.565579 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"97b6e83c17f69d7656075a5bbbdc74a84a37c9b113e490b694fcf93158a36296"} Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.872162 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.959958 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll794\" (UniqueName: \"kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960120 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960232 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960253 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960294 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift\") pod \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\" (UID: \"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd\") " Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.960701 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.961226 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.965937 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794" (OuterVolumeSpecName: "kube-api-access-ll794") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "kube-api-access-ll794". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.968751 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.980895 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts" (OuterVolumeSpecName: "scripts") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:09:25 crc kubenswrapper[4703]: I1011 04:09:25.985063 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" (UID: "76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.062152 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll794\" (UniqueName: \"kubernetes.io/projected/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-kube-api-access-ll794\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.062187 4703 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.062222 4703 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.062230 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 
04:09:26.062239 4703 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.062247 4703 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.578357 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" event={"ID":"76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd","Type":"ContainerDied","Data":"41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158"} Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.578399 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f9b33da734e858f85c3197556dcca80a4cec0cb71ad6a8b8e889e56942e158" Oct 11 04:09:26 crc kubenswrapper[4703]: I1011 04:09:26.578401 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-q8xj5" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.590736 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"a6919198b05def229682581b39d5cbce020db19af71fc7b5c156a248e96d6ee1"} Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.591067 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"105ecf8600cdcb75003dc14d034ae0b04738476fb5f48ddb990b10a4f4f3fbb0"} Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.591087 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"095ff4da24f60ee4ac9070735b1306b86c552793780bc851b43a3d227a548c89"} Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.591105 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"21474667bd1550746bec32d60232b79ea103597a69863fc67ea3812eaad496c8"} Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.716092 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw"] Oct 11 04:09:27 crc kubenswrapper[4703]: E1011 04:09:27.716509 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" containerName="swift-ring-rebalance" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.716537 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" containerName="swift-ring-rebalance" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.716771 
4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd" containerName="swift-ring-rebalance" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.718263 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.720940 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vxrk2" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.726428 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw"] Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.789291 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.789396 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.789552 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhlv\" (UniqueName: \"kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv\") pod 
\"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.890854 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.890966 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.891052 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhlv\" (UniqueName: \"kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.891679 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " 
pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.891811 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:27 crc kubenswrapper[4703]: I1011 04:09:27.916132 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhlv\" (UniqueName: \"kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv\") pod \"b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:28 crc kubenswrapper[4703]: I1011 04:09:28.045628 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:28 crc kubenswrapper[4703]: I1011 04:09:28.494737 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw"] Oct 11 04:09:28 crc kubenswrapper[4703]: I1011 04:09:28.597576 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" event={"ID":"3b3d5f97-7702-47e8-836c-fd713dbf3070","Type":"ContainerStarted","Data":"bf11e15837583d7a5da2a47d22b1226854b454cd8bd620097649e9a29ac1c5d1"} Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.605073 4703 generic.go:334] "Generic (PLEG): container finished" podID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerID="9d4954a81dcb32deb78241544e33cda1958c9ff4826bdd73002a4f5adcc17a9f" exitCode=0 Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.605424 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" event={"ID":"3b3d5f97-7702-47e8-836c-fd713dbf3070","Type":"ContainerDied","Data":"9d4954a81dcb32deb78241544e33cda1958c9ff4826bdd73002a4f5adcc17a9f"} Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.609285 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"85ef023774e860a611fc53d335ceae99012f777ab83cb7b97d77cdacbadb236c"} Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.609319 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"48a2b575630d17fdc0ca9a68c17261f942d526bcf605c096f64eddaae3fbd9f7"} Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.609331 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"1ca6877ff7f216aaa14fc77c6113d8d23ea462a50fc48eec5c903afb7daf7684"} Oct 11 04:09:29 crc kubenswrapper[4703]: I1011 04:09:29.609342 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"b165ee0b86efef08252d99407740bb28dc7428f0813577dc09d3b7851cfabe0b"} Oct 11 04:09:30 crc kubenswrapper[4703]: I1011 04:09:30.624702 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"b30c25a5d6616063ed2fd9352ffa137f3364f3f85851e6084e03e4f0c7011443"} Oct 11 04:09:30 crc kubenswrapper[4703]: I1011 04:09:30.628677 4703 generic.go:334] "Generic (PLEG): container finished" podID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerID="f314771a67fa36494a85459557811f95c13b48d6b4784ca079005e3e966f53a1" exitCode=0 Oct 11 04:09:30 crc kubenswrapper[4703]: I1011 04:09:30.628806 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" event={"ID":"3b3d5f97-7702-47e8-836c-fd713dbf3070","Type":"ContainerDied","Data":"f314771a67fa36494a85459557811f95c13b48d6b4784ca079005e3e966f53a1"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.350557 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.357877 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6-etc-swift\") pod \"swift-proxy-6dd8f59749-c7s8q\" (UID: \"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6\") " pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.461145 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.649076 4703 generic.go:334] "Generic (PLEG): container finished" podID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerID="3a220575165ab6741bbfc8e115651123810f3d688de5b2059c3c3be48ff579d8" exitCode=0 Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.649208 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" event={"ID":"3b3d5f97-7702-47e8-836c-fd713dbf3070","Type":"ContainerDied","Data":"3a220575165ab6741bbfc8e115651123810f3d688de5b2059c3c3be48ff579d8"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658137 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"7461aa48b41c81bb30771c2294d213db50415a52d417b29cb4897a8d959825d6"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658396 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"8c34dbdf8ce0be535bb1dba516b4100a6ff886f88a32bd90e66b572f96ab2620"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658411 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"eeb8e7e50865766b0c1c757a6d7dd7ae248009d64db09e0ad45323a1c52c1fb7"} Oct 11 
04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658420 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"fd30be3e7587d25d0d8e177de04a20d5978e9e05409a10522499fee81ef5c947"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658429 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"22ef5ab062e0bf92ae3417e0c0cb8e6daf36c247c21819d02dfd76c5dbc7f36a"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.658439 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"edaa5850-8c67-4739-8aa9-b02eff7e3291","Type":"ContainerStarted","Data":"6a4e3075343b2abb3da1e320e2bba6428d0c662a7ed3a8c074584c3034cd4c02"} Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.711253 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=19.872416478 podStartE2EDuration="24.71123458s" podCreationTimestamp="2025-10-11 04:09:07 +0000 UTC" firstStartedPulling="2025-10-11 04:09:25.546337182 +0000 UTC m=+856.756819114" lastFinishedPulling="2025-10-11 04:09:30.385155264 +0000 UTC m=+861.595637216" observedRunningTime="2025-10-11 04:09:31.703719103 +0000 UTC m=+862.914201025" watchObservedRunningTime="2025-10-11 04:09:31.71123458 +0000 UTC m=+862.921716502" Oct 11 04:09:31 crc kubenswrapper[4703]: I1011 04:09:31.941884 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q"] Oct 11 04:09:32 crc kubenswrapper[4703]: I1011 04:09:32.668357 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" 
event={"ID":"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6","Type":"ContainerStarted","Data":"a81da9f3eb70600673fb74daeefddbc50197f02b75d04e2c8689fa487f494fde"} Oct 11 04:09:32 crc kubenswrapper[4703]: I1011 04:09:32.668730 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" event={"ID":"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6","Type":"ContainerStarted","Data":"9047df001b301d659a5cd935a39098eb953548cb359c813872680015f2720c52"} Oct 11 04:09:32 crc kubenswrapper[4703]: I1011 04:09:32.668753 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" event={"ID":"e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6","Type":"ContainerStarted","Data":"4af604fbfbf9d09237918ca1b899a52aeccd8668201a3323379bd1d5fd5bc7d0"} Oct 11 04:09:32 crc kubenswrapper[4703]: I1011 04:09:32.668781 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:32 crc kubenswrapper[4703]: I1011 04:09:32.691941 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" podStartSLOduration=17.691923385 podStartE2EDuration="17.691923385s" podCreationTimestamp="2025-10-11 04:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:09:32.686117782 +0000 UTC m=+863.896599704" watchObservedRunningTime="2025-10-11 04:09:32.691923385 +0000 UTC m=+863.902405307" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.011344 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.082639 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjhlv\" (UniqueName: \"kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv\") pod \"3b3d5f97-7702-47e8-836c-fd713dbf3070\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.082689 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle\") pod \"3b3d5f97-7702-47e8-836c-fd713dbf3070\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.082774 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util\") pod \"3b3d5f97-7702-47e8-836c-fd713dbf3070\" (UID: \"3b3d5f97-7702-47e8-836c-fd713dbf3070\") " Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.083679 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle" (OuterVolumeSpecName: "bundle") pod "3b3d5f97-7702-47e8-836c-fd713dbf3070" (UID: "3b3d5f97-7702-47e8-836c-fd713dbf3070"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.090173 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv" (OuterVolumeSpecName: "kube-api-access-zjhlv") pod "3b3d5f97-7702-47e8-836c-fd713dbf3070" (UID: "3b3d5f97-7702-47e8-836c-fd713dbf3070"). InnerVolumeSpecName "kube-api-access-zjhlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.097281 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util" (OuterVolumeSpecName: "util") pod "3b3d5f97-7702-47e8-836c-fd713dbf3070" (UID: "3b3d5f97-7702-47e8-836c-fd713dbf3070"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.185581 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjhlv\" (UniqueName: \"kubernetes.io/projected/3b3d5f97-7702-47e8-836c-fd713dbf3070-kube-api-access-zjhlv\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.185832 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.185916 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b3d5f97-7702-47e8-836c-fd713dbf3070-util\") on node \"crc\" DevicePath \"\"" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.684084 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.684119 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw" event={"ID":"3b3d5f97-7702-47e8-836c-fd713dbf3070","Type":"ContainerDied","Data":"bf11e15837583d7a5da2a47d22b1226854b454cd8bd620097649e9a29ac1c5d1"} Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.685151 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf11e15837583d7a5da2a47d22b1226854b454cd8bd620097649e9a29ac1c5d1" Oct 11 04:09:33 crc kubenswrapper[4703]: I1011 04:09:33.685182 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:41 crc kubenswrapper[4703]: I1011 04:09:41.464626 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:41 crc kubenswrapper[4703]: I1011 04:09:41.465222 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6dd8f59749-c7s8q" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.665575 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2"] Oct 11 04:09:42 crc kubenswrapper[4703]: E1011 04:09:42.666195 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="extract" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.666209 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="extract" Oct 11 04:09:42 crc kubenswrapper[4703]: E1011 04:09:42.666225 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="pull" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.666234 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="pull" Oct 11 04:09:42 crc kubenswrapper[4703]: E1011 04:09:42.666252 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="util" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.666260 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="util" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.666423 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3d5f97-7702-47e8-836c-fd713dbf3070" containerName="extract" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.667248 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.669174 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.669183 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2r55k" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.681252 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2"] Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.832644 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-apiservice-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: 
\"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.832716 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87d98\" (UniqueName: \"kubernetes.io/projected/766465e2-cc0d-40f4-97cb-d79d92dec7eb-kube-api-access-87d98\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.832793 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-webhook-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.934276 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-webhook-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.934362 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-apiservice-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.934433 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87d98\" (UniqueName: \"kubernetes.io/projected/766465e2-cc0d-40f4-97cb-d79d92dec7eb-kube-api-access-87d98\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.941355 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-webhook-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.941380 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766465e2-cc0d-40f4-97cb-d79d92dec7eb-apiservice-cert\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:42 crc kubenswrapper[4703]: I1011 04:09:42.954198 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87d98\" (UniqueName: \"kubernetes.io/projected/766465e2-cc0d-40f4-97cb-d79d92dec7eb-kube-api-access-87d98\") pod \"glance-operator-controller-manager-7f48cf958-x89d2\" (UID: \"766465e2-cc0d-40f4-97cb-d79d92dec7eb\") " pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:43 crc kubenswrapper[4703]: I1011 04:09:43.015521 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:43 crc kubenswrapper[4703]: I1011 04:09:43.532390 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2"] Oct 11 04:09:43 crc kubenswrapper[4703]: I1011 04:09:43.749184 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" event={"ID":"766465e2-cc0d-40f4-97cb-d79d92dec7eb","Type":"ContainerStarted","Data":"1f33d0437a4fc392aa88b666db36e289e7f15062944ad71022b01e0c3351527d"} Oct 11 04:09:46 crc kubenswrapper[4703]: I1011 04:09:46.784026 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" event={"ID":"766465e2-cc0d-40f4-97cb-d79d92dec7eb","Type":"ContainerStarted","Data":"f3dc129050688dc2042bf8450e86e88424cebeb3c947f1147f1f7697ac0a51be"} Oct 11 04:09:47 crc kubenswrapper[4703]: I1011 04:09:47.795560 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" event={"ID":"766465e2-cc0d-40f4-97cb-d79d92dec7eb","Type":"ContainerStarted","Data":"35a6d1dace4d076ec285e2c4925409807c589af6d6ebed884c6e45b40408fdcf"} Oct 11 04:09:47 crc kubenswrapper[4703]: I1011 04:09:47.795958 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:47 crc kubenswrapper[4703]: I1011 04:09:47.826958 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" podStartSLOduration=2.693959257 podStartE2EDuration="5.826937096s" podCreationTimestamp="2025-10-11 04:09:42 +0000 UTC" firstStartedPulling="2025-10-11 04:09:43.55857948 +0000 UTC m=+874.769061402" lastFinishedPulling="2025-10-11 
04:09:46.691557279 +0000 UTC m=+877.902039241" observedRunningTime="2025-10-11 04:09:47.819285075 +0000 UTC m=+879.029767017" watchObservedRunningTime="2025-10-11 04:09:47.826937096 +0000 UTC m=+879.037419028" Oct 11 04:09:53 crc kubenswrapper[4703]: I1011 04:09:53.022095 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7f48cf958-x89d2" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.870268 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-87jj2"] Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.871970 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.876452 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.878273 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.880991 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.881167 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.883871 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-lrfdz" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.884064 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.886151 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-87jj2"] Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.891518 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.979395 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.979642 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.979761 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmn8\" (UniqueName: \"kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8\") pod \"glance-db-create-87jj2\" (UID: \"e404d3d1-1536-4de8-b8a2-e13e7d61328b\") " pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.979800 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flpmj\" (UniqueName: \"kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:57 crc kubenswrapper[4703]: I1011 04:09:57.979837 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.081192 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.081484 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbmn8\" (UniqueName: \"kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8\") pod \"glance-db-create-87jj2\" (UID: \"e404d3d1-1536-4de8-b8a2-e13e7d61328b\") " pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.081507 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flpmj\" (UniqueName: \"kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.081527 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.081544 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.082321 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.082323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.100560 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.100627 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flpmj\" (UniqueName: \"kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj\") pod \"openstackclient\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.104935 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmn8\" (UniqueName: \"kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8\") pod \"glance-db-create-87jj2\" (UID: \"e404d3d1-1536-4de8-b8a2-e13e7d61328b\") " pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.191343 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.195008 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.649875 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-87jj2"] Oct 11 04:09:58 crc kubenswrapper[4703]: W1011 04:09:58.656177 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode404d3d1_1536_4de8_b8a2_e13e7d61328b.slice/crio-98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a WatchSource:0}: Error finding container 98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a: Status 404 returned error can't find the container with id 98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.702022 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:09:58 crc kubenswrapper[4703]: W1011 04:09:58.707702 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f5e879_8b5b_45a9_96d5_12c5753b8d6b.slice/crio-326f43039dfc5653034cc538019f1d7f2c776984b71403b3f4b674b53b1543aa WatchSource:0}: Error finding container 326f43039dfc5653034cc538019f1d7f2c776984b71403b3f4b674b53b1543aa: Status 404 returned error can't find the container with id 326f43039dfc5653034cc538019f1d7f2c776984b71403b3f4b674b53b1543aa Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.921478 4703 generic.go:334] "Generic (PLEG): container finished" podID="e404d3d1-1536-4de8-b8a2-e13e7d61328b" containerID="2d59bece9c709dc22b4670beec1a99c5c70b2d85849c636775fc658cf5f5b39a" exitCode=0 Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.921941 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-87jj2" 
event={"ID":"e404d3d1-1536-4de8-b8a2-e13e7d61328b","Type":"ContainerDied","Data":"2d59bece9c709dc22b4670beec1a99c5c70b2d85849c636775fc658cf5f5b39a"} Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.921973 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-87jj2" event={"ID":"e404d3d1-1536-4de8-b8a2-e13e7d61328b","Type":"ContainerStarted","Data":"98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a"} Oct 11 04:09:58 crc kubenswrapper[4703]: I1011 04:09:58.923252 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b","Type":"ContainerStarted","Data":"326f43039dfc5653034cc538019f1d7f2c776984b71403b3f4b674b53b1543aa"} Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.203508 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.314781 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbmn8\" (UniqueName: \"kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8\") pod \"e404d3d1-1536-4de8-b8a2-e13e7d61328b\" (UID: \"e404d3d1-1536-4de8-b8a2-e13e7d61328b\") " Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.326884 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8" (OuterVolumeSpecName: "kube-api-access-dbmn8") pod "e404d3d1-1536-4de8-b8a2-e13e7d61328b" (UID: "e404d3d1-1536-4de8-b8a2-e13e7d61328b"). InnerVolumeSpecName "kube-api-access-dbmn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.416195 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbmn8\" (UniqueName: \"kubernetes.io/projected/e404d3d1-1536-4de8-b8a2-e13e7d61328b-kube-api-access-dbmn8\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.941020 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-87jj2" event={"ID":"e404d3d1-1536-4de8-b8a2-e13e7d61328b","Type":"ContainerDied","Data":"98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a"} Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.941305 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98154ecc524c84fa4db5c14e5cd8dbdc4fb823f41b30322e5a42b3e433538c1a" Oct 11 04:10:00 crc kubenswrapper[4703]: I1011 04:10:00.941072 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-87jj2" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.880851 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3490-account-create-4pwmz"] Oct 11 04:10:07 crc kubenswrapper[4703]: E1011 04:10:07.881692 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e404d3d1-1536-4de8-b8a2-e13e7d61328b" containerName="mariadb-database-create" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.881711 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e404d3d1-1536-4de8-b8a2-e13e7d61328b" containerName="mariadb-database-create" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.881857 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e404d3d1-1536-4de8-b8a2-e13e7d61328b" containerName="mariadb-database-create" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.882249 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.884379 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 11 04:10:07 crc kubenswrapper[4703]: I1011 04:10:07.892401 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3490-account-create-4pwmz"] Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.006953 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b","Type":"ContainerStarted","Data":"4149f59704f0bba419fbf1e746c27e33c6257421aa50090ad4f6fd26b7161163"} Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.041649 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.105132374 podStartE2EDuration="11.041628511s" podCreationTimestamp="2025-10-11 04:09:57 +0000 UTC" firstStartedPulling="2025-10-11 04:09:58.710501436 +0000 UTC m=+889.920983358" lastFinishedPulling="2025-10-11 04:10:07.646997573 +0000 UTC m=+898.857479495" observedRunningTime="2025-10-11 04:10:08.0229824 +0000 UTC m=+899.233464352" watchObservedRunningTime="2025-10-11 04:10:08.041628511 +0000 UTC m=+899.252110433" Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.044925 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw5z\" (UniqueName: \"kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z\") pod \"glance-3490-account-create-4pwmz\" (UID: \"0eb416e8-1024-41ef-9551-d2635269ca21\") " pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.146212 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw5z\" (UniqueName: 
\"kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z\") pod \"glance-3490-account-create-4pwmz\" (UID: \"0eb416e8-1024-41ef-9551-d2635269ca21\") " pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.163786 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw5z\" (UniqueName: \"kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z\") pod \"glance-3490-account-create-4pwmz\" (UID: \"0eb416e8-1024-41ef-9551-d2635269ca21\") " pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.223888 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:08 crc kubenswrapper[4703]: I1011 04:10:08.536063 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3490-account-create-4pwmz"] Oct 11 04:10:08 crc kubenswrapper[4703]: W1011 04:10:08.540902 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eb416e8_1024_41ef_9551_d2635269ca21.slice/crio-83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b WatchSource:0}: Error finding container 83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b: Status 404 returned error can't find the container with id 83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b Oct 11 04:10:09 crc kubenswrapper[4703]: I1011 04:10:09.021152 4703 generic.go:334] "Generic (PLEG): container finished" podID="0eb416e8-1024-41ef-9551-d2635269ca21" containerID="d7357289a1651775c7d6e340068a89380869665d9f1fa747a80d6e20c964521c" exitCode=0 Oct 11 04:10:09 crc kubenswrapper[4703]: I1011 04:10:09.021242 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" 
event={"ID":"0eb416e8-1024-41ef-9551-d2635269ca21","Type":"ContainerDied","Data":"d7357289a1651775c7d6e340068a89380869665d9f1fa747a80d6e20c964521c"} Oct 11 04:10:09 crc kubenswrapper[4703]: I1011 04:10:09.021565 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" event={"ID":"0eb416e8-1024-41ef-9551-d2635269ca21","Type":"ContainerStarted","Data":"83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b"} Oct 11 04:10:10 crc kubenswrapper[4703]: I1011 04:10:10.333826 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:10 crc kubenswrapper[4703]: I1011 04:10:10.481904 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcw5z\" (UniqueName: \"kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z\") pod \"0eb416e8-1024-41ef-9551-d2635269ca21\" (UID: \"0eb416e8-1024-41ef-9551-d2635269ca21\") " Oct 11 04:10:10 crc kubenswrapper[4703]: I1011 04:10:10.488189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z" (OuterVolumeSpecName: "kube-api-access-xcw5z") pod "0eb416e8-1024-41ef-9551-d2635269ca21" (UID: "0eb416e8-1024-41ef-9551-d2635269ca21"). InnerVolumeSpecName "kube-api-access-xcw5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:10:10 crc kubenswrapper[4703]: I1011 04:10:10.584490 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcw5z\" (UniqueName: \"kubernetes.io/projected/0eb416e8-1024-41ef-9551-d2635269ca21-kube-api-access-xcw5z\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:11 crc kubenswrapper[4703]: I1011 04:10:11.042387 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" event={"ID":"0eb416e8-1024-41ef-9551-d2635269ca21","Type":"ContainerDied","Data":"83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b"} Oct 11 04:10:11 crc kubenswrapper[4703]: I1011 04:10:11.042444 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83604336954b7b6f079ca5af28c57599199bc77217eb5178b030f72de6d3689b" Oct 11 04:10:11 crc kubenswrapper[4703]: I1011 04:10:11.042511 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3490-account-create-4pwmz" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.010220 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-6z8lk"] Oct 11 04:10:13 crc kubenswrapper[4703]: E1011 04:10:13.010814 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb416e8-1024-41ef-9551-d2635269ca21" containerName="mariadb-account-create" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.010829 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb416e8-1024-41ef-9551-d2635269ca21" containerName="mariadb-account-create" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.011009 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb416e8-1024-41ef-9551-d2635269ca21" containerName="mariadb-account-create" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.011671 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.014575 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-7dkkv" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.016902 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.020339 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6z8lk"] Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.020568 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjt54\" (UniqueName: \"kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.020763 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.020869 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.121510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.121764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.121829 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjt54\" (UniqueName: \"kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.127389 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.127644 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data\") pod \"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.150739 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjt54\" (UniqueName: \"kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54\") pod 
\"glance-db-sync-6z8lk\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.337220 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:13 crc kubenswrapper[4703]: I1011 04:10:13.802067 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6z8lk"] Oct 11 04:10:13 crc kubenswrapper[4703]: W1011 04:10:13.808267 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dbd298_477d_4565_8e77_f5a1735560f6.slice/crio-070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e WatchSource:0}: Error finding container 070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e: Status 404 returned error can't find the container with id 070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e Oct 11 04:10:14 crc kubenswrapper[4703]: I1011 04:10:14.060270 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6z8lk" event={"ID":"d4dbd298-477d-4565-8e77-f5a1735560f6","Type":"ContainerStarted","Data":"070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e"} Oct 11 04:10:27 crc kubenswrapper[4703]: I1011 04:10:27.200339 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6z8lk" event={"ID":"d4dbd298-477d-4565-8e77-f5a1735560f6","Type":"ContainerStarted","Data":"4d62f6315776b366f8e29151b144f33260ed637d95896f626009e82d09aae0dd"} Oct 11 04:10:27 crc kubenswrapper[4703]: I1011 04:10:27.225320 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-6z8lk" podStartSLOduration=3.121586498 podStartE2EDuration="15.225264845s" podCreationTimestamp="2025-10-11 04:10:12 +0000 UTC" firstStartedPulling="2025-10-11 04:10:13.810517766 
+0000 UTC m=+905.020999688" lastFinishedPulling="2025-10-11 04:10:25.914196093 +0000 UTC m=+917.124678035" observedRunningTime="2025-10-11 04:10:27.214737407 +0000 UTC m=+918.425219359" watchObservedRunningTime="2025-10-11 04:10:27.225264845 +0000 UTC m=+918.435746787" Oct 11 04:10:33 crc kubenswrapper[4703]: I1011 04:10:33.259255 4703 generic.go:334] "Generic (PLEG): container finished" podID="d4dbd298-477d-4565-8e77-f5a1735560f6" containerID="4d62f6315776b366f8e29151b144f33260ed637d95896f626009e82d09aae0dd" exitCode=0 Oct 11 04:10:33 crc kubenswrapper[4703]: I1011 04:10:33.259318 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6z8lk" event={"ID":"d4dbd298-477d-4565-8e77-f5a1735560f6","Type":"ContainerDied","Data":"4d62f6315776b366f8e29151b144f33260ed637d95896f626009e82d09aae0dd"} Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.624385 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.740267 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data\") pod \"d4dbd298-477d-4565-8e77-f5a1735560f6\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.740412 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data\") pod \"d4dbd298-477d-4565-8e77-f5a1735560f6\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.740703 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjt54\" (UniqueName: 
\"kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54\") pod \"d4dbd298-477d-4565-8e77-f5a1735560f6\" (UID: \"d4dbd298-477d-4565-8e77-f5a1735560f6\") " Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.747049 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d4dbd298-477d-4565-8e77-f5a1735560f6" (UID: "d4dbd298-477d-4565-8e77-f5a1735560f6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.752737 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54" (OuterVolumeSpecName: "kube-api-access-gjt54") pod "d4dbd298-477d-4565-8e77-f5a1735560f6" (UID: "d4dbd298-477d-4565-8e77-f5a1735560f6"). InnerVolumeSpecName "kube-api-access-gjt54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.785583 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data" (OuterVolumeSpecName: "config-data") pod "d4dbd298-477d-4565-8e77-f5a1735560f6" (UID: "d4dbd298-477d-4565-8e77-f5a1735560f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.843966 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjt54\" (UniqueName: \"kubernetes.io/projected/d4dbd298-477d-4565-8e77-f5a1735560f6-kube-api-access-gjt54\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.844015 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:34 crc kubenswrapper[4703]: I1011 04:10:34.844037 4703 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d4dbd298-477d-4565-8e77-f5a1735560f6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:35 crc kubenswrapper[4703]: I1011 04:10:35.283027 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6z8lk" event={"ID":"d4dbd298-477d-4565-8e77-f5a1735560f6","Type":"ContainerDied","Data":"070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e"} Oct 11 04:10:35 crc kubenswrapper[4703]: I1011 04:10:35.283096 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="070beec757cca0eee7a78b352e21be28d50393e81c2d68eb7eb5717e8be4c19e" Oct 11 04:10:35 crc kubenswrapper[4703]: I1011 04:10:35.283091 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6z8lk" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.723018 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:36 crc kubenswrapper[4703]: E1011 04:10:36.723568 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dbd298-477d-4565-8e77-f5a1735560f6" containerName="glance-db-sync" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.723581 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dbd298-477d-4565-8e77-f5a1735560f6" containerName="glance-db-sync" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.723711 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dbd298-477d-4565-8e77-f5a1735560f6" containerName="glance-db-sync" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.724393 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.726370 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-7dkkv" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.726668 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.727081 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.745394 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.753729 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.755008 4703 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.773943 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.870495 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.870777 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.870868 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.870955 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871050 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871119 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871212 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871301 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871376 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871569 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871654 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871732 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871812 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.871912 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872022 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872100 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872175 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872251 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872339 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872417 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872570 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt4dr\" (UniqueName: \"kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872608 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872634 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872707 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872755 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872792 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872816 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.872845 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhcc\" (UniqueName: \"kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.953220 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:36 crc kubenswrapper[4703]: E1011 04:10:36.953802 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-jt4dr lib-modules logs run scripts sys var-locks-brick], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/glance-default-single-1" 
podUID="256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974198 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974267 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974307 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974358 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974409 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974441 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974552 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974599 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974639 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974697 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974746 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974747 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974830 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974859 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974888 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974906 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974946 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.974976 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975022 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt4dr\" (UniqueName: \"kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975042 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975058 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975078 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975103 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975119 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975135 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhcc\" (UniqueName: \"kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975178 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975216 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975238 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975340 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975667 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975749 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme\") pod 
\"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975767 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975811 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975867 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975890 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975856 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " 
pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975929 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.976076 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.976387 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.976498 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.976550 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: 
I1011 04:10:36.977002 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.977973 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.979054 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.979118 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.975813 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.979204 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.979824 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.979811 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.980052 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.980307 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:36 crc kubenswrapper[4703]: I1011 04:10:36.984075 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data\") pod \"glance-default-single-1\" (UID: 
\"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.004323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.020973 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt4dr\" (UniqueName: \"kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.027714 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhcc\" (UniqueName: \"kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.029897 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.031363 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:37 crc 
kubenswrapper[4703]: I1011 04:10:37.032180 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.042395 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.043760 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.296848 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.307515 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390743 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390835 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390874 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390910 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390901 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390945 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.390965 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run" (OuterVolumeSpecName: "run") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391002 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391056 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391094 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt4dr\" (UniqueName: \"kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391098 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys" (OuterVolumeSpecName: "sys") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391184 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391218 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391117 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391167 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs" (OuterVolumeSpecName: "logs") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391214 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391251 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391298 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev" (OuterVolumeSpecName: "dev") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391326 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391396 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\" (UID: \"256aa9ad-8dc4-4d46-ab51-d2453ef31d1d\") " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.391491 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392085 4703 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392104 4703 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-dev\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392116 4703 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392127 4703 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392138 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392148 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392160 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392171 4703 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-sys\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.392182 4703 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.396299 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr" (OuterVolumeSpecName: "kube-api-access-jt4dr") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "kube-api-access-jt4dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.396546 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.397049 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts" (OuterVolumeSpecName: "scripts") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.397628 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). 
InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.398350 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data" (OuterVolumeSpecName: "config-data") pod "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" (UID: "256aa9ad-8dc4-4d46-ab51-d2453ef31d1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.489849 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.493052 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt4dr\" (UniqueName: \"kubernetes.io/projected/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-kube-api-access-jt4dr\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.493091 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.493101 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.493114 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.493123 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d-scripts\") on node \"crc\" DevicePath \"\"" Oct 
11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.508422 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.511252 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.594255 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:37 crc kubenswrapper[4703]: I1011 04:10:37.594291 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.307938 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.307989 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerStarted","Data":"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c"} Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.308784 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerStarted","Data":"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c"} Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.308808 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerStarted","Data":"37342e8d03d61bbc83246cfea217e64a8aba26bb36466c5e16891d4b2c598f95"} Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.333189 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.33316529 podStartE2EDuration="3.33316529s" podCreationTimestamp="2025-10-11 04:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:10:38.32972828 +0000 UTC m=+929.540210242" watchObservedRunningTime="2025-10-11 04:10:38.33316529 +0000 UTC m=+929.543647252" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.453183 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.458105 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.482775 
4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.484310 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.497496 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634347 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634399 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634439 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634522 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634553 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nphxb\" (UniqueName: \"kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634571 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634589 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634629 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634650 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634666 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634731 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634823 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634890 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.634938 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736211 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nphxb\" (UniqueName: \"kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736258 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736279 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736331 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736351 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736365 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736384 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736401 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736426 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736449 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736484 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736507 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736526 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.736564 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737500 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737549 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737616 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737669 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737684 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737713 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737735 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737737 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737828 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.737850 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.738099 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.744242 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.748790 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.754589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nphxb\" (UniqueName: \"kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.762304 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.790194 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:38 crc kubenswrapper[4703]: I1011 04:10:38.806429 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:39 crc kubenswrapper[4703]: I1011 04:10:39.118426 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:10:39 crc kubenswrapper[4703]: I1011 04:10:39.326226 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerStarted","Data":"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5"} Oct 11 04:10:39 crc kubenswrapper[4703]: I1011 04:10:39.326595 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerStarted","Data":"68b184972cdc14279ac3c34c9bf15dda1de8dfa6797950a556262c51532ad65b"} Oct 11 04:10:39 crc kubenswrapper[4703]: I1011 04:10:39.551019 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="256aa9ad-8dc4-4d46-ab51-d2453ef31d1d" path="/var/lib/kubelet/pods/256aa9ad-8dc4-4d46-ab51-d2453ef31d1d/volumes" Oct 11 04:10:40 crc kubenswrapper[4703]: I1011 04:10:40.339632 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerStarted","Data":"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d"} Oct 11 04:10:40 crc kubenswrapper[4703]: I1011 04:10:40.360758 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.360739752 podStartE2EDuration="2.360739752s" podCreationTimestamp="2025-10-11 04:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:10:40.359302564 +0000 UTC m=+931.569784516" watchObservedRunningTime="2025-10-11 04:10:40.360739752 +0000 
UTC m=+931.571221674" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.044387 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.044928 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.078037 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.083113 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.387753 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:47 crc kubenswrapper[4703]: I1011 04:10:47.387800 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:48 crc kubenswrapper[4703]: I1011 04:10:48.807523 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:48 crc kubenswrapper[4703]: I1011 04:10:48.807763 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:48 crc kubenswrapper[4703]: I1011 04:10:48.845081 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:48 crc kubenswrapper[4703]: I1011 04:10:48.903614 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:49 crc kubenswrapper[4703]: I1011 04:10:49.411134 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:49 crc kubenswrapper[4703]: I1011 04:10:49.411180 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:50 crc kubenswrapper[4703]: I1011 04:10:50.255200 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:10:50 crc kubenswrapper[4703]: I1011 04:10:50.255594 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.572634 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.574874 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.636445 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.851376 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.851841 4703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.916621 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:10:53 crc kubenswrapper[4703]: I1011 04:10:53.982893 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:55 crc kubenswrapper[4703]: I1011 04:10:55.457411 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-httpd" containerID="cri-o://c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c" gracePeriod=30 Oct 11 04:10:55 crc kubenswrapper[4703]: I1011 04:10:55.457359 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-log" containerID="cri-o://7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c" gracePeriod=30 Oct 11 04:10:55 crc kubenswrapper[4703]: I1011 04:10:55.463077 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.101:9292/healthcheck\": EOF" Oct 11 04:10:55 crc kubenswrapper[4703]: I1011 04:10:55.465297 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.101:9292/healthcheck\": EOF" Oct 11 04:10:56 crc kubenswrapper[4703]: I1011 04:10:56.469931 4703 generic.go:334] "Generic (PLEG): container finished" podID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerID="7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c" exitCode=143 Oct 11 04:10:56 crc kubenswrapper[4703]: I1011 04:10:56.470311 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerDied","Data":"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c"} Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.348320 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452645 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452691 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452707 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452725 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452764 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452791 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452784 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452808 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452828 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys" (OuterVolumeSpecName: "sys") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452847 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452858 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452875 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452889 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452881 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452905 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev" (OuterVolumeSpecName: "dev") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452936 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.452975 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453001 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453049 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453085 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmhcc\" (UniqueName: \"kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc\") pod \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\" (UID: \"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0\") " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453137 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run" (OuterVolumeSpecName: "run") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453271 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs" (OuterVolumeSpecName: "logs") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453417 4703 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453431 4703 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453440 4703 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-sys\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453449 4703 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453457 4703 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453479 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453487 4703 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-dev\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453494 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.453700 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.457893 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts" (OuterVolumeSpecName: "scripts") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.458551 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.459349 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc" (OuterVolumeSpecName: "kube-api-access-jmhcc") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "kube-api-access-jmhcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.460398 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.500966 4703 generic.go:334] "Generic (PLEG): container finished" podID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerID="c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c" exitCode=0 Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.501248 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.501298 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerDied","Data":"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c"} Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.502293 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0","Type":"ContainerDied","Data":"37342e8d03d61bbc83246cfea217e64a8aba26bb36466c5e16891d4b2c598f95"} Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.502338 4703 scope.go:117] "RemoveContainer" containerID="c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.502324 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data" (OuterVolumeSpecName: "config-data") pod 
"485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" (UID: "485cef99-d5fd-43f4-b3dd-7ecb7272b1f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.549320 4703 scope.go:117] "RemoveContainer" containerID="7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555513 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmhcc\" (UniqueName: \"kubernetes.io/projected/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-kube-api-access-jmhcc\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555543 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555654 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555689 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555707 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.555719 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.574038 4703 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.578348 4703 scope.go:117] "RemoveContainer" containerID="c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c" Oct 11 04:10:59 crc kubenswrapper[4703]: E1011 04:10:59.578916 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c\": container with ID starting with c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c not found: ID does not exist" containerID="c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.578949 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c"} err="failed to get container status \"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c\": rpc error: code = NotFound desc = could not find container \"c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c\": container with ID starting with c90fd72a009fd474b732f091c1324f7ff8f257a20757a416564cc55ba70b6e8c not found: ID does not exist" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.578970 4703 scope.go:117] "RemoveContainer" containerID="7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c" Oct 11 04:10:59 crc kubenswrapper[4703]: E1011 04:10:59.579520 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c\": container with ID starting with 7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c not found: ID does not exist" 
containerID="7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.579545 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c"} err="failed to get container status \"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c\": rpc error: code = NotFound desc = could not find container \"7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c\": container with ID starting with 7f8abfbbd7636d423396b1c95a1fa19ff32a59bc06675d3d82c5d5c104d6a75c not found: ID does not exist" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.585389 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.657320 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.657355 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.822050 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.841427 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.855033 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:59 crc kubenswrapper[4703]: E1011 04:10:59.855353 4703 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-log" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.855375 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-log" Oct 11 04:10:59 crc kubenswrapper[4703]: E1011 04:10:59.855399 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-httpd" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.855408 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-httpd" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.855678 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-httpd" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.855693 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" containerName="glance-log" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.856577 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.918255 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.960805 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.960857 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.960891 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.960919 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.960982 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961011 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961047 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62v7\" (UniqueName: \"kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961071 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961142 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961169 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961206 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961234 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961420 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:10:59 crc kubenswrapper[4703]: I1011 04:10:59.961526 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063528 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063590 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063624 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063667 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063690 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063717 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys\") pod \"glance-default-single-0\" (UID: 
\"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063737 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063855 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063870 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063804 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063791 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc 
kubenswrapper[4703]: I1011 04:11:00.063921 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063791 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.063912 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064334 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064381 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064457 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n62v7\" (UniqueName: \"kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064781 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064806 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064867 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064892 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064840 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.064947 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.065050 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.065352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.068368 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.070286 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " 
pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.092224 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.094717 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.102173 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62v7\" (UniqueName: \"kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7\") pod \"glance-default-single-0\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.172911 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:00 crc kubenswrapper[4703]: I1011 04:11:00.638195 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:00 crc kubenswrapper[4703]: W1011 04:11:00.650636 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7407f9e5_80e5_4be4_afaa_399672f901d8.slice/crio-021cbd10115586b60d975cb9c4b6ac9537e87ca98eef9e4803978e2454bd32d5 WatchSource:0}: Error finding container 021cbd10115586b60d975cb9c4b6ac9537e87ca98eef9e4803978e2454bd32d5: Status 404 returned error can't find the container with id 021cbd10115586b60d975cb9c4b6ac9537e87ca98eef9e4803978e2454bd32d5 Oct 11 04:11:01 crc kubenswrapper[4703]: I1011 04:11:01.518854 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerStarted","Data":"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739"} Oct 11 04:11:01 crc kubenswrapper[4703]: I1011 04:11:01.519728 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerStarted","Data":"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5"} Oct 11 04:11:01 crc kubenswrapper[4703]: I1011 04:11:01.519830 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerStarted","Data":"021cbd10115586b60d975cb9c4b6ac9537e87ca98eef9e4803978e2454bd32d5"} Oct 11 04:11:01 crc kubenswrapper[4703]: I1011 04:11:01.545306 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485cef99-d5fd-43f4-b3dd-7ecb7272b1f0" path="/var/lib/kubelet/pods/485cef99-d5fd-43f4-b3dd-7ecb7272b1f0/volumes" Oct 11 
04:11:01 crc kubenswrapper[4703]: I1011 04:11:01.557490 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.557453626 podStartE2EDuration="2.557453626s" podCreationTimestamp="2025-10-11 04:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:11:01.554642872 +0000 UTC m=+952.765124814" watchObservedRunningTime="2025-10-11 04:11:01.557453626 +0000 UTC m=+952.767935558" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.173569 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.174158 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.215594 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.234041 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.604173 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:10 crc kubenswrapper[4703]: I1011 04:11:10.604401 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:12 crc kubenswrapper[4703]: I1011 04:11:12.531935 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:12 crc kubenswrapper[4703]: I1011 04:11:12.546153 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:20 crc kubenswrapper[4703]: I1011 04:11:20.257233 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:11:20 crc kubenswrapper[4703]: I1011 04:11:20.257674 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.452033 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6z8lk"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.458443 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6z8lk"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.484938 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3490-account-delete-rnf62"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.486361 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.494221 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3490-account-delete-rnf62"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.541639 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dbd298-477d-4565-8e77-f5a1735560f6" path="/var/lib/kubelet/pods/d4dbd298-477d-4565-8e77-f5a1735560f6/volumes" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.567254 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzmkf\" (UniqueName: \"kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf\") pod \"glance3490-account-delete-rnf62\" (UID: \"2f2263eb-cb8a-4462-8a3a-7053c5555a1d\") " pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.572988 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.573291 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-log" containerID="cri-o://aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5" gracePeriod=30 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.573370 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-httpd" containerID="cri-o://2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739" gracePeriod=30 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.584212 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.584595 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-log" containerID="cri-o://5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5" gracePeriod=30 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.584819 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-httpd" containerID="cri-o://6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d" gracePeriod=30 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.625847 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.626290 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" containerName="openstackclient" containerID="cri-o://4149f59704f0bba419fbf1e746c27e33c6257421aa50090ad4f6fd26b7161163" gracePeriod=30 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.668278 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzmkf\" (UniqueName: \"kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf\") pod \"glance3490-account-delete-rnf62\" (UID: \"2f2263eb-cb8a-4462-8a3a-7053c5555a1d\") " pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.698875 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzmkf\" (UniqueName: \"kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf\") pod 
\"glance3490-account-delete-rnf62\" (UID: \"2f2263eb-cb8a-4462-8a3a-7053c5555a1d\") " pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.758925 4703 generic.go:334] "Generic (PLEG): container finished" podID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerID="aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5" exitCode=143 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.759003 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerDied","Data":"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5"} Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.761081 4703 generic.go:334] "Generic (PLEG): container finished" podID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerID="5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5" exitCode=143 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.761152 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerDied","Data":"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5"} Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.762733 4703 generic.go:334] "Generic (PLEG): container finished" podID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" containerID="4149f59704f0bba419fbf1e746c27e33c6257421aa50090ad4f6fd26b7161163" exitCode=143 Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.762756 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b","Type":"ContainerDied","Data":"4149f59704f0bba419fbf1e746c27e33c6257421aa50090ad4f6fd26b7161163"} Oct 11 04:11:29 crc kubenswrapper[4703]: I1011 04:11:29.844040 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.003111 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.073886 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts\") pod \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.073935 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret\") pod \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.073971 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config\") pod \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.074110 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flpmj\" (UniqueName: \"kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj\") pod \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\" (UID: \"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b\") " Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.075250 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod 
"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" (UID: "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.084923 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj" (OuterVolumeSpecName: "kube-api-access-flpmj") pod "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" (UID: "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b"). InnerVolumeSpecName "kube-api-access-flpmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.095308 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3490-account-delete-rnf62"] Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.100236 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" (UID: "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.104106 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" (UID: "d8f5e879-8b5b-45a9-96d5-12c5753b8d6b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.176120 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flpmj\" (UniqueName: \"kubernetes.io/projected/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-kube-api-access-flpmj\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.176153 4703 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.176162 4703 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.176171 4703 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.771111 4703 generic.go:334] "Generic (PLEG): container finished" podID="2f2263eb-cb8a-4462-8a3a-7053c5555a1d" containerID="5f5a7cbd82719763cf8993d75aea233c3a0d82be78bdbd3108dcda952e541729" exitCode=0 Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.771225 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" event={"ID":"2f2263eb-cb8a-4462-8a3a-7053c5555a1d","Type":"ContainerDied","Data":"5f5a7cbd82719763cf8993d75aea233c3a0d82be78bdbd3108dcda952e541729"} Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.771532 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" 
event={"ID":"2f2263eb-cb8a-4462-8a3a-7053c5555a1d","Type":"ContainerStarted","Data":"21f215c67f48b37fe979b353f0b1709d24f9c6471a804c5f92072dbbe9b200c5"} Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.773364 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d8f5e879-8b5b-45a9-96d5-12c5753b8d6b","Type":"ContainerDied","Data":"326f43039dfc5653034cc538019f1d7f2c776984b71403b3f4b674b53b1543aa"} Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.773408 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.773425 4703 scope.go:117] "RemoveContainer" containerID="4149f59704f0bba419fbf1e746c27e33c6257421aa50090ad4f6fd26b7161163" Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.809377 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:11:30 crc kubenswrapper[4703]: I1011 04:11:30.815224 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:11:31 crc kubenswrapper[4703]: I1011 04:11:31.541902 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" path="/var/lib/kubelet/pods/d8f5e879-8b5b-45a9-96d5-12c5753b8d6b/volumes" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.141386 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.210890 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzmkf\" (UniqueName: \"kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf\") pod \"2f2263eb-cb8a-4462-8a3a-7053c5555a1d\" (UID: \"2f2263eb-cb8a-4462-8a3a-7053c5555a1d\") " Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.215685 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf" (OuterVolumeSpecName: "kube-api-access-rzmkf") pod "2f2263eb-cb8a-4462-8a3a-7053c5555a1d" (UID: "2f2263eb-cb8a-4462-8a3a-7053c5555a1d"). InnerVolumeSpecName "kube-api-access-rzmkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.312887 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzmkf\" (UniqueName: \"kubernetes.io/projected/2f2263eb-cb8a-4462-8a3a-7053c5555a1d-kube-api-access-rzmkf\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.743505 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read tcp 10.217.0.2:55888->10.217.0.104:9292: read: connection reset by peer" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.743505 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read tcp 10.217.0.2:55882->10.217.0.104:9292: read: connection reset by peer" Oct 11 
04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.790950 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" event={"ID":"2f2263eb-cb8a-4462-8a3a-7053c5555a1d","Type":"ContainerDied","Data":"21f215c67f48b37fe979b353f0b1709d24f9c6471a804c5f92072dbbe9b200c5"} Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.790988 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f215c67f48b37fe979b353f0b1709d24f9c6471a804c5f92072dbbe9b200c5" Oct 11 04:11:32 crc kubenswrapper[4703]: I1011 04:11:32.791036 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3490-account-delete-rnf62" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.195743 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.199800 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331347 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331821 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331887 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331923 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331943 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331976 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs\") pod 
\"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.331998 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332027 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332093 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332119 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332143 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332164 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332181 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332204 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n62v7\" (UniqueName: \"kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332225 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332283 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nphxb\" (UniqueName: \"kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: 
\"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332336 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332363 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332382 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332407 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332426 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: 
\"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332501 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332529 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data\") pod \"5895dcda-da2b-4c63-ae27-8d9a289718a2\" (UID: \"5895dcda-da2b-4c63-ae27-8d9a289718a2\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332548 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332570 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332596 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick\") pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332620 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi\") 
pod \"7407f9e5-80e5-4be4-afaa-399672f901d8\" (UID: \"7407f9e5-80e5-4be4-afaa-399672f901d8\") " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332836 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs" (OuterVolumeSpecName: "logs") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.332865 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333251 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333270 4703 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333283 4703 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333311 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs" (OuterVolumeSpecName: "logs") pod 
"5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333321 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run" (OuterVolumeSpecName: "run") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.333349 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys" (OuterVolumeSpecName: "sys") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.337193 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts" (OuterVolumeSpecName: "scripts") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.337275 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev" (OuterVolumeSpecName: "dev") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.337741 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.337908 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.337986 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338029 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev" (OuterVolumeSpecName: "dev") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338351 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338501 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338573 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys" (OuterVolumeSpecName: "sys") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338614 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338629 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338650 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338687 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run" (OuterVolumeSpecName: "run") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338690 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338725 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.338833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.339770 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.341172 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7" (OuterVolumeSpecName: "kube-api-access-n62v7") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "kube-api-access-n62v7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.341359 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts" (OuterVolumeSpecName: "scripts") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.342348 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb" (OuterVolumeSpecName: "kube-api-access-nphxb") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "kube-api-access-nphxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.382768 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data" (OuterVolumeSpecName: "config-data") pod "7407f9e5-80e5-4be4-afaa-399672f901d8" (UID: "7407f9e5-80e5-4be4-afaa-399672f901d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.396131 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data" (OuterVolumeSpecName: "config-data") pod "5895dcda-da2b-4c63-ae27-8d9a289718a2" (UID: "5895dcda-da2b-4c63-ae27-8d9a289718a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436074 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436105 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436122 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436131 4703 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436142 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5895dcda-da2b-4c63-ae27-8d9a289718a2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436150 4703 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436161 4703 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-dev\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436170 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n62v7\" (UniqueName: 
\"kubernetes.io/projected/7407f9e5-80e5-4be4-afaa-399672f901d8-kube-api-access-n62v7\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436179 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7407f9e5-80e5-4be4-afaa-399672f901d8-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436187 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nphxb\" (UniqueName: \"kubernetes.io/projected/5895dcda-da2b-4c63-ae27-8d9a289718a2-kube-api-access-nphxb\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436199 4703 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-sys\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436226 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436242 4703 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-sys\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436252 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7407f9e5-80e5-4be4-afaa-399672f901d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436261 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436271 4703 reconciler_common.go:293] 
"Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5895dcda-da2b-4c63-ae27-8d9a289718a2-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436282 4703 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436290 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436300 4703 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-dev\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436317 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436327 4703 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436335 4703 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7407f9e5-80e5-4be4-afaa-399672f901d8-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436351 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 11 04:11:33 crc 
kubenswrapper[4703]: I1011 04:11:33.436366 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.436377 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5895dcda-da2b-4c63-ae27-8d9a289718a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.459626 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.461584 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.462035 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.462163 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.537380 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.537406 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.537414 4703 reconciler_common.go:293] "Volume 
detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.537422 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.800149 4703 generic.go:334] "Generic (PLEG): container finished" podID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerID="2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739" exitCode=0 Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.800209 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerDied","Data":"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739"} Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.800234 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"7407f9e5-80e5-4be4-afaa-399672f901d8","Type":"ContainerDied","Data":"021cbd10115586b60d975cb9c4b6ac9537e87ca98eef9e4803978e2454bd32d5"} Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.800251 4703 scope.go:117] "RemoveContainer" containerID="2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.800370 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.805455 4703 generic.go:334] "Generic (PLEG): container finished" podID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerID="6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d" exitCode=0 Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.805549 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.805795 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerDied","Data":"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d"} Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.805917 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5895dcda-da2b-4c63-ae27-8d9a289718a2","Type":"ContainerDied","Data":"68b184972cdc14279ac3c34c9bf15dda1de8dfa6797950a556262c51532ad65b"} Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.831757 4703 scope.go:117] "RemoveContainer" containerID="aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.831910 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.836259 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.848355 4703 scope.go:117] "RemoveContainer" containerID="2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739" Oct 11 04:11:33 crc kubenswrapper[4703]: E1011 04:11:33.848859 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739\": container with ID starting with 2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739 not found: ID does not exist" containerID="2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.848897 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739"} err="failed to get container status \"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739\": rpc error: code = NotFound desc = could not find container \"2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739\": container with ID starting with 2a43002367aede56b4c7061f0b163d76259dca2693e670660fca171e43a8f739 not found: ID does not exist" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.848922 4703 scope.go:117] "RemoveContainer" containerID="aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5" Oct 11 04:11:33 crc kubenswrapper[4703]: E1011 04:11:33.849296 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5\": container with ID starting with aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5 not found: ID does not exist" containerID="aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.849324 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5"} err="failed to get container status \"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5\": rpc error: code = NotFound desc = could not find container 
\"aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5\": container with ID starting with aff07b7c15d4389206f4633ea7b18f0bb40cd36448adff14fe12fcf44a019af5 not found: ID does not exist" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.849342 4703 scope.go:117] "RemoveContainer" containerID="6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.863057 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.879583 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.884901 4703 scope.go:117] "RemoveContainer" containerID="5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.899357 4703 scope.go:117] "RemoveContainer" containerID="6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d" Oct 11 04:11:33 crc kubenswrapper[4703]: E1011 04:11:33.899726 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d\": container with ID starting with 6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d not found: ID does not exist" containerID="6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.899772 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d"} err="failed to get container status \"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d\": rpc error: code = NotFound desc = could not find container 
\"6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d\": container with ID starting with 6dfb24529618ae483ac5a3938f5bf499fba997d06cc621c5317a788fd55eb49d not found: ID does not exist" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.899800 4703 scope.go:117] "RemoveContainer" containerID="5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5" Oct 11 04:11:33 crc kubenswrapper[4703]: E1011 04:11:33.900138 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5\": container with ID starting with 5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5 not found: ID does not exist" containerID="5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5" Oct 11 04:11:33 crc kubenswrapper[4703]: I1011 04:11:33.900159 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5"} err="failed to get container status \"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5\": rpc error: code = NotFound desc = could not find container \"5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5\": container with ID starting with 5c6003c5567d245fbd340e019bee8f3ab3d9caa6d8ebc3f228d621804fa732a5 not found: ID does not exist" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.508599 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-87jj2"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.527452 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3490-account-create-4pwmz"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.536969 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-87jj2"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 
04:11:34.543455 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3490-account-delete-rnf62"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.549601 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3490-account-create-4pwmz"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.555651 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3490-account-delete-rnf62"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677254 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-j6pwn"] Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677520 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677531 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677546 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2263eb-cb8a-4462-8a3a-7053c5555a1d" containerName="mariadb-account-delete" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677554 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2263eb-cb8a-4462-8a3a-7053c5555a1d" containerName="mariadb-account-delete" Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677567 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677574 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677586 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677592 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677602 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677607 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: E1011 04:11:34.677621 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" containerName="openstackclient" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677629 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" containerName="openstackclient" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677758 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677770 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f5e879-8b5b-45a9-96d5-12c5753b8d6b" containerName="openstackclient" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677779 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2263eb-cb8a-4462-8a3a-7053c5555a1d" containerName="mariadb-account-delete" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677787 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677792 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-httpd" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.677806 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" containerName="glance-log" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.678249 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.689997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-j6pwn"] Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.768252 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcv8s\" (UniqueName: \"kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s\") pod \"glance-db-create-j6pwn\" (UID: \"804cac9a-99d0-40cc-9251-627acf580d2a\") " pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.870409 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcv8s\" (UniqueName: \"kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s\") pod \"glance-db-create-j6pwn\" (UID: \"804cac9a-99d0-40cc-9251-627acf580d2a\") " pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:34 crc kubenswrapper[4703]: I1011 04:11:34.895397 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcv8s\" (UniqueName: \"kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s\") pod \"glance-db-create-j6pwn\" (UID: \"804cac9a-99d0-40cc-9251-627acf580d2a\") " pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:34.999849 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.282378 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-j6pwn"] Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.547068 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb416e8-1024-41ef-9551-d2635269ca21" path="/var/lib/kubelet/pods/0eb416e8-1024-41ef-9551-d2635269ca21/volumes" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.548575 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2263eb-cb8a-4462-8a3a-7053c5555a1d" path="/var/lib/kubelet/pods/2f2263eb-cb8a-4462-8a3a-7053c5555a1d/volumes" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.549761 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5895dcda-da2b-4c63-ae27-8d9a289718a2" path="/var/lib/kubelet/pods/5895dcda-da2b-4c63-ae27-8d9a289718a2/volumes" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.551908 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7407f9e5-80e5-4be4-afaa-399672f901d8" path="/var/lib/kubelet/pods/7407f9e5-80e5-4be4-afaa-399672f901d8/volumes" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.553228 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e404d3d1-1536-4de8-b8a2-e13e7d61328b" path="/var/lib/kubelet/pods/e404d3d1-1536-4de8-b8a2-e13e7d61328b/volumes" Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.833837 4703 generic.go:334] "Generic (PLEG): container finished" podID="804cac9a-99d0-40cc-9251-627acf580d2a" containerID="e3423e355e537e4e22934b87f1e3d77a685f1ba3e6a5130923a582cf723e90a2" exitCode=0 Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.833895 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-j6pwn" 
event={"ID":"804cac9a-99d0-40cc-9251-627acf580d2a","Type":"ContainerDied","Data":"e3423e355e537e4e22934b87f1e3d77a685f1ba3e6a5130923a582cf723e90a2"} Oct 11 04:11:35 crc kubenswrapper[4703]: I1011 04:11:35.833960 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-j6pwn" event={"ID":"804cac9a-99d0-40cc-9251-627acf580d2a","Type":"ContainerStarted","Data":"7eb0fa6930161b0319af3f094e94c9bc50cf747d2ed4d2a7d9f404f9302d352d"} Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.191789 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.310720 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcv8s\" (UniqueName: \"kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s\") pod \"804cac9a-99d0-40cc-9251-627acf580d2a\" (UID: \"804cac9a-99d0-40cc-9251-627acf580d2a\") " Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.315834 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s" (OuterVolumeSpecName: "kube-api-access-wcv8s") pod "804cac9a-99d0-40cc-9251-627acf580d2a" (UID: "804cac9a-99d0-40cc-9251-627acf580d2a"). InnerVolumeSpecName "kube-api-access-wcv8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.412645 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcv8s\" (UniqueName: \"kubernetes.io/projected/804cac9a-99d0-40cc-9251-627acf580d2a-kube-api-access-wcv8s\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.855822 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-j6pwn" event={"ID":"804cac9a-99d0-40cc-9251-627acf580d2a","Type":"ContainerDied","Data":"7eb0fa6930161b0319af3f094e94c9bc50cf747d2ed4d2a7d9f404f9302d352d"} Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.855859 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-j6pwn" Oct 11 04:11:38 crc kubenswrapper[4703]: I1011 04:11:37.855860 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eb0fa6930161b0319af3f094e94c9bc50cf747d2ed4d2a7d9f404f9302d352d" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.719917 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-2e4d-account-create-pgl4w"] Oct 11 04:11:44 crc kubenswrapper[4703]: E1011 04:11:44.720877 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804cac9a-99d0-40cc-9251-627acf580d2a" containerName="mariadb-database-create" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.720896 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="804cac9a-99d0-40cc-9251-627acf580d2a" containerName="mariadb-database-create" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.721084 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="804cac9a-99d0-40cc-9251-627acf580d2a" containerName="mariadb-database-create" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.721631 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.727368 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.732626 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2e4d-account-create-pgl4w"] Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.826932 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swq2v\" (UniqueName: \"kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v\") pod \"glance-2e4d-account-create-pgl4w\" (UID: \"3afca6b0-45f3-460d-864b-b0de72f2e817\") " pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:44 crc kubenswrapper[4703]: I1011 04:11:44.941038 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swq2v\" (UniqueName: \"kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v\") pod \"glance-2e4d-account-create-pgl4w\" (UID: \"3afca6b0-45f3-460d-864b-b0de72f2e817\") " pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.005392 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swq2v\" (UniqueName: \"kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v\") pod \"glance-2e4d-account-create-pgl4w\" (UID: \"3afca6b0-45f3-460d-864b-b0de72f2e817\") " pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.042919 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.438976 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2e4d-account-create-pgl4w"] Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.930379 4703 generic.go:334] "Generic (PLEG): container finished" podID="3afca6b0-45f3-460d-864b-b0de72f2e817" containerID="bc6efe063baa783056e245f2a02bbb9c8b21b34b03786d991f0426384ae6b188" exitCode=0 Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.930495 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" event={"ID":"3afca6b0-45f3-460d-864b-b0de72f2e817","Type":"ContainerDied","Data":"bc6efe063baa783056e245f2a02bbb9c8b21b34b03786d991f0426384ae6b188"} Oct 11 04:11:45 crc kubenswrapper[4703]: I1011 04:11:45.930836 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" event={"ID":"3afca6b0-45f3-460d-864b-b0de72f2e817","Type":"ContainerStarted","Data":"aba79cfd9e2de28e74cfe9d725931a3d41a6694c0afc38016a582aa0e1ba7941"} Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.279363 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.377339 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swq2v\" (UniqueName: \"kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v\") pod \"3afca6b0-45f3-460d-864b-b0de72f2e817\" (UID: \"3afca6b0-45f3-460d-864b-b0de72f2e817\") " Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.381577 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v" (OuterVolumeSpecName: "kube-api-access-swq2v") pod "3afca6b0-45f3-460d-864b-b0de72f2e817" (UID: "3afca6b0-45f3-460d-864b-b0de72f2e817"). InnerVolumeSpecName "kube-api-access-swq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.479188 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swq2v\" (UniqueName: \"kubernetes.io/projected/3afca6b0-45f3-460d-864b-b0de72f2e817-kube-api-access-swq2v\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.948439 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" event={"ID":"3afca6b0-45f3-460d-864b-b0de72f2e817","Type":"ContainerDied","Data":"aba79cfd9e2de28e74cfe9d725931a3d41a6694c0afc38016a582aa0e1ba7941"} Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.948504 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba79cfd9e2de28e74cfe9d725931a3d41a6694c0afc38016a582aa0e1ba7941" Oct 11 04:11:47 crc kubenswrapper[4703]: I1011 04:11:47.948617 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2e4d-account-create-pgl4w" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.870354 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-tcwg2"] Oct 11 04:11:49 crc kubenswrapper[4703]: E1011 04:11:49.870868 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afca6b0-45f3-460d-864b-b0de72f2e817" containerName="mariadb-account-create" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.870885 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afca6b0-45f3-460d-864b-b0de72f2e817" containerName="mariadb-account-create" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.871044 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afca6b0-45f3-460d-864b-b0de72f2e817" containerName="mariadb-account-create" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.871529 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.874145 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-dqx54" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.877555 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.877568 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 11 04:11:49 crc kubenswrapper[4703]: I1011 04:11:49.886847 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tcwg2"] Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.022430 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.022710 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.022868 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.023006 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsvnx\" (UniqueName: \"kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.124381 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.124455 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.124553 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.124595 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsvnx\" (UniqueName: \"kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.128903 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.129390 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.130705 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data\") pod 
\"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.141720 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsvnx\" (UniqueName: \"kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx\") pod \"glance-db-sync-tcwg2\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.195720 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.259191 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.259560 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.259615 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.260148 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.260199 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439" gracePeriod=600 Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.444174 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tcwg2"] Oct 11 04:11:50 crc kubenswrapper[4703]: W1011 04:11:50.450249 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455b5b3f_2658_49e5_91ef_b6b687431c79.slice/crio-6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989 WatchSource:0}: Error finding container 6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989: Status 404 returned error can't find the container with id 6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989 Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.974541 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tcwg2" event={"ID":"455b5b3f-2658-49e5-91ef-b6b687431c79","Type":"ContainerStarted","Data":"5d69e32ed0904675d392fd4a15d7d761c77f9792b20402c42e2b00b03ef730df"} Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.974852 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tcwg2" event={"ID":"455b5b3f-2658-49e5-91ef-b6b687431c79","Type":"ContainerStarted","Data":"6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989"} Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.976985 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" 
containerID="16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439" exitCode=0 Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.977037 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439"} Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.977059 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9"} Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.977091 4703 scope.go:117] "RemoveContainer" containerID="0e380b9b01cda230cd0b9c8f75d2fe209660f6bd6b7bddb2546a440639390852" Oct 11 04:11:50 crc kubenswrapper[4703]: I1011 04:11:50.996604 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-tcwg2" podStartSLOduration=1.996586489 podStartE2EDuration="1.996586489s" podCreationTimestamp="2025-10-11 04:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:11:50.995943693 +0000 UTC m=+1002.206425615" watchObservedRunningTime="2025-10-11 04:11:50.996586489 +0000 UTC m=+1002.207068411" Oct 11 04:11:54 crc kubenswrapper[4703]: I1011 04:11:54.005692 4703 generic.go:334] "Generic (PLEG): container finished" podID="455b5b3f-2658-49e5-91ef-b6b687431c79" containerID="5d69e32ed0904675d392fd4a15d7d761c77f9792b20402c42e2b00b03ef730df" exitCode=0 Oct 11 04:11:54 crc kubenswrapper[4703]: I1011 04:11:54.005802 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tcwg2" 
event={"ID":"455b5b3f-2658-49e5-91ef-b6b687431c79","Type":"ContainerDied","Data":"5d69e32ed0904675d392fd4a15d7d761c77f9792b20402c42e2b00b03ef730df"} Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.330013 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.503416 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle\") pod \"455b5b3f-2658-49e5-91ef-b6b687431c79\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.503530 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsvnx\" (UniqueName: \"kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx\") pod \"455b5b3f-2658-49e5-91ef-b6b687431c79\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.503677 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data\") pod \"455b5b3f-2658-49e5-91ef-b6b687431c79\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.503715 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data\") pod \"455b5b3f-2658-49e5-91ef-b6b687431c79\" (UID: \"455b5b3f-2658-49e5-91ef-b6b687431c79\") " Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.508938 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data" 
(OuterVolumeSpecName: "db-sync-config-data") pod "455b5b3f-2658-49e5-91ef-b6b687431c79" (UID: "455b5b3f-2658-49e5-91ef-b6b687431c79"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.509672 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx" (OuterVolumeSpecName: "kube-api-access-hsvnx") pod "455b5b3f-2658-49e5-91ef-b6b687431c79" (UID: "455b5b3f-2658-49e5-91ef-b6b687431c79"). InnerVolumeSpecName "kube-api-access-hsvnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.539723 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "455b5b3f-2658-49e5-91ef-b6b687431c79" (UID: "455b5b3f-2658-49e5-91ef-b6b687431c79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.561383 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data" (OuterVolumeSpecName: "config-data") pod "455b5b3f-2658-49e5-91ef-b6b687431c79" (UID: "455b5b3f-2658-49e5-91ef-b6b687431c79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.606541 4703 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.606591 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.606611 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b5b3f-2658-49e5-91ef-b6b687431c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:55 crc kubenswrapper[4703]: I1011 04:11:55.606632 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsvnx\" (UniqueName: \"kubernetes.io/projected/455b5b3f-2658-49e5-91ef-b6b687431c79-kube-api-access-hsvnx\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.023988 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tcwg2" event={"ID":"455b5b3f-2658-49e5-91ef-b6b687431c79","Type":"ContainerDied","Data":"6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989"} Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.024045 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe03f38383406f729f7ed0c42ea02184410556a2cf938676b3651c7b0955989" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.024124 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tcwg2" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.404575 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:56 crc kubenswrapper[4703]: E1011 04:11:56.405094 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455b5b3f-2658-49e5-91ef-b6b687431c79" containerName="glance-db-sync" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.405109 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="455b5b3f-2658-49e5-91ef-b6b687431c79" containerName="glance-db-sync" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.405294 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="455b5b3f-2658-49e5-91ef-b6b687431c79" containerName="glance-db-sync" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.406091 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.407817 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.409027 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-dqx54" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.409083 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.410575 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.410949 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.417739 4703 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.419507 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521771 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521822 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521855 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521878 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521905 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95jm\" (UniqueName: \"kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.521952 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.522054 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.522147 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.522225 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623557 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623619 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623660 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623688 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623777 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95jm\" (UniqueName: \"kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623804 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623861 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.623893 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.624815 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.625119 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.629758 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.631313 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.637022 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.637273 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs\") pod \"glance-default-single-0\" (UID: 
\"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.643015 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95jm\" (UniqueName: \"kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.643137 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.645862 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts\") pod \"glance-default-single-0\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:56 crc kubenswrapper[4703]: I1011 04:11:56.721757 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:57 crc kubenswrapper[4703]: I1011 04:11:57.172851 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:57 crc kubenswrapper[4703]: W1011 04:11:57.177589 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58fa50a0_09ef_4b9c_b8ea_0104221c0581.slice/crio-43d10ab45398a40519351a6631c742c64952d88846f0160841c2c1d39c7d1a42 WatchSource:0}: Error finding container 43d10ab45398a40519351a6631c742c64952d88846f0160841c2c1d39c7d1a42: Status 404 returned error can't find the container with id 43d10ab45398a40519351a6631c742c64952d88846f0160841c2c1d39c7d1a42 Oct 11 04:11:57 crc kubenswrapper[4703]: I1011 04:11:57.606509 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:11:58 crc kubenswrapper[4703]: I1011 04:11:58.039282 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerStarted","Data":"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4"} Oct 11 04:11:58 crc kubenswrapper[4703]: I1011 04:11:58.039335 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerStarted","Data":"43d10ab45398a40519351a6631c742c64952d88846f0160841c2c1d39c7d1a42"} Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.055351 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerStarted","Data":"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21"} Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.055644 4703 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-log" containerID="cri-o://be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" gracePeriod=30 Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.055712 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-httpd" containerID="cri-o://8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" gracePeriod=30 Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.103991 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.103963632 podStartE2EDuration="3.103963632s" podCreationTimestamp="2025-10-11 04:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:11:59.098175789 +0000 UTC m=+1010.308657721" watchObservedRunningTime="2025-10-11 04:11:59.103963632 +0000 UTC m=+1010.314445584" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.703068 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885168 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885361 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885434 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885522 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885543 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885566 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885585 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l95jm\" (UniqueName: \"kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885608 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.885657 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run\") pod \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\" (UID: \"58fa50a0-09ef-4b9c-b8ea-0104221c0581\") " Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.886592 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.886734 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs" (OuterVolumeSpecName: "logs") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.894174 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm" (OuterVolumeSpecName: "kube-api-access-l95jm") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "kube-api-access-l95jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.894321 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.903205 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts" (OuterVolumeSpecName: "scripts") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.917886 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.933680 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data" (OuterVolumeSpecName: "config-data") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.938360 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.944240 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58fa50a0-09ef-4b9c-b8ea-0104221c0581" (UID: "58fa50a0-09ef-4b9c-b8ea-0104221c0581"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.986983 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987203 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987320 4703 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987402 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987499 4703 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987583 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fa50a0-09ef-4b9c-b8ea-0104221c0581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987683 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58fa50a0-09ef-4b9c-b8ea-0104221c0581-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.987787 4703 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-l95jm\" (UniqueName: \"kubernetes.io/projected/58fa50a0-09ef-4b9c-b8ea-0104221c0581-kube-api-access-l95jm\") on node \"crc\" DevicePath \"\"" Oct 11 04:11:59 crc kubenswrapper[4703]: I1011 04:11:59.988299 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.009402 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.067150 4703 generic.go:334] "Generic (PLEG): container finished" podID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerID="8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" exitCode=0 Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.067974 4703 generic.go:334] "Generic (PLEG): container finished" podID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerID="be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" exitCode=143 Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.067259 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerDied","Data":"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21"} Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.067216 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.068320 4703 scope.go:117] "RemoveContainer" containerID="8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.068217 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerDied","Data":"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4"} Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.069238 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"58fa50a0-09ef-4b9c-b8ea-0104221c0581","Type":"ContainerDied","Data":"43d10ab45398a40519351a6631c742c64952d88846f0160841c2c1d39c7d1a42"} Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.089627 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.094766 4703 scope.go:117] "RemoveContainer" containerID="be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.117547 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.140028 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.144165 4703 scope.go:117] "RemoveContainer" containerID="8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.159539 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 
04:12:00 crc kubenswrapper[4703]: E1011 04:12:00.159979 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-log" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.160004 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-log" Oct 11 04:12:00 crc kubenswrapper[4703]: E1011 04:12:00.160026 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-httpd" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.160039 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-httpd" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.160280 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-httpd" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.160326 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" containerName="glance-log" Oct 11 04:12:00 crc kubenswrapper[4703]: E1011 04:12:00.161717 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21\": container with ID starting with 8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21 not found: ID does not exist" containerID="8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.161745 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.161760 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21"} err="failed to get container status \"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21\": rpc error: code = NotFound desc = could not find container \"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21\": container with ID starting with 8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21 not found: ID does not exist" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.161799 4703 scope.go:117] "RemoveContainer" containerID="be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" Oct 11 04:12:00 crc kubenswrapper[4703]: E1011 04:12:00.162737 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4\": container with ID starting with be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4 not found: ID does not exist" containerID="be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.162889 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4"} err="failed to get container status \"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4\": rpc error: code = NotFound desc = could not find container \"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4\": container with ID starting with be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4 not found: ID does not exist" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.163001 4703 scope.go:117] 
"RemoveContainer" containerID="8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173060 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21"} err="failed to get container status \"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21\": rpc error: code = NotFound desc = could not find container \"8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21\": container with ID starting with 8218e051a2319043e2dee048df8691a144aef362f9a430e39a9e3c9f6d918d21 not found: ID does not exist" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173314 4703 scope.go:117] "RemoveContainer" containerID="be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173372 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173667 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173849 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-dqx54" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.173989 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4"} err="failed to get container status \"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4\": rpc error: code = NotFound desc = could not find container \"be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4\": container with ID starting with be06f9df2dadf0ffa8305196df064506ad37cb7b7b58c84de0dec5de6dab14f4 not found: ID does not exist" Oct 11 
04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.174304 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.174755 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.175047 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.175915 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299324 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299372 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299400 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299415 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299547 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299725 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299914 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.299957 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqfwl\" (UniqueName: \"kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 
04:12:00.300064 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401240 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqfwl\" (UniqueName: \"kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401320 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401375 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401406 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401445 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401483 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401520 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401564 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401605 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.401766 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.402969 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.403438 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.407766 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.408126 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.408162 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs\") pod \"glance-default-single-0\" (UID: 
\"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.408300 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.410739 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.420658 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqfwl\" (UniqueName: \"kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.426001 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-0\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.520317 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:00 crc kubenswrapper[4703]: I1011 04:12:00.811769 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:00 crc kubenswrapper[4703]: W1011 04:12:00.815457 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f64df5_124e_4345_b7d3_388ecb90eafd.slice/crio-6bbb94f23ddea7a003ec6b0de0eff302b5f1435c5550804f455ebb17d98f0753 WatchSource:0}: Error finding container 6bbb94f23ddea7a003ec6b0de0eff302b5f1435c5550804f455ebb17d98f0753: Status 404 returned error can't find the container with id 6bbb94f23ddea7a003ec6b0de0eff302b5f1435c5550804f455ebb17d98f0753 Oct 11 04:12:01 crc kubenswrapper[4703]: I1011 04:12:01.077739 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerStarted","Data":"6bbb94f23ddea7a003ec6b0de0eff302b5f1435c5550804f455ebb17d98f0753"} Oct 11 04:12:01 crc kubenswrapper[4703]: I1011 04:12:01.549349 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fa50a0-09ef-4b9c-b8ea-0104221c0581" path="/var/lib/kubelet/pods/58fa50a0-09ef-4b9c-b8ea-0104221c0581/volumes" Oct 11 04:12:02 crc kubenswrapper[4703]: I1011 04:12:02.089001 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerStarted","Data":"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8"} Oct 11 04:12:02 crc kubenswrapper[4703]: I1011 04:12:02.089056 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerStarted","Data":"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946"} Oct 11 
04:12:02 crc kubenswrapper[4703]: I1011 04:12:02.122182 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.122160509 podStartE2EDuration="2.122160509s" podCreationTimestamp="2025-10-11 04:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:12:02.120445104 +0000 UTC m=+1013.330927046" watchObservedRunningTime="2025-10-11 04:12:02.122160509 +0000 UTC m=+1013.332642431" Oct 11 04:12:10 crc kubenswrapper[4703]: I1011 04:12:10.520813 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:10 crc kubenswrapper[4703]: I1011 04:12:10.521568 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:10 crc kubenswrapper[4703]: I1011 04:12:10.553701 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:10 crc kubenswrapper[4703]: I1011 04:12:10.571247 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:11 crc kubenswrapper[4703]: I1011 04:12:11.170024 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:11 crc kubenswrapper[4703]: I1011 04:12:11.170367 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:13 crc kubenswrapper[4703]: I1011 04:12:13.009250 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:13 crc kubenswrapper[4703]: I1011 04:12:13.139040 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:14 crc kubenswrapper[4703]: I1011 04:12:14.994332 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tcwg2"] Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.000380 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tcwg2"] Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.026764 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance2e4d-account-delete-nbtzv"] Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.028199 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.037938 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2e4d-account-delete-nbtzv"] Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.098363 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.164284 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp7gv\" (UniqueName: \"kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv\") pod \"glance2e4d-account-delete-nbtzv\" (UID: \"e8f14596-799c-4b87-87f4-f5589685876e\") " pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.211071 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-log" containerID="cri-o://a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946" gracePeriod=30 Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.211109 4703 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-httpd" containerID="cri-o://ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8" gracePeriod=30 Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.265434 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp7gv\" (UniqueName: \"kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv\") pod \"glance2e4d-account-delete-nbtzv\" (UID: \"e8f14596-799c-4b87-87f4-f5589685876e\") " pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.289167 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp7gv\" (UniqueName: \"kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv\") pod \"glance2e4d-account-delete-nbtzv\" (UID: \"e8f14596-799c-4b87-87f4-f5589685876e\") " pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.345932 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.541331 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455b5b3f-2658-49e5-91ef-b6b687431c79" path="/var/lib/kubelet/pods/455b5b3f-2658-49e5-91ef-b6b687431c79/volumes" Oct 11 04:12:15 crc kubenswrapper[4703]: I1011 04:12:15.787085 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2e4d-account-delete-nbtzv"] Oct 11 04:12:16 crc kubenswrapper[4703]: I1011 04:12:16.239567 4703 generic.go:334] "Generic (PLEG): container finished" podID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerID="a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946" exitCode=143 Oct 11 04:12:16 crc kubenswrapper[4703]: I1011 04:12:16.239737 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerDied","Data":"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946"} Oct 11 04:12:16 crc kubenswrapper[4703]: I1011 04:12:16.247901 4703 generic.go:334] "Generic (PLEG): container finished" podID="e8f14596-799c-4b87-87f4-f5589685876e" containerID="764b62471bcf937100a0c27282fc1feabbb399465d03451a1447945ee99f0f2d" exitCode=0 Oct 11 04:12:16 crc kubenswrapper[4703]: I1011 04:12:16.247957 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" event={"ID":"e8f14596-799c-4b87-87f4-f5589685876e","Type":"ContainerDied","Data":"764b62471bcf937100a0c27282fc1feabbb399465d03451a1447945ee99f0f2d"} Oct 11 04:12:16 crc kubenswrapper[4703]: I1011 04:12:16.247990 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" event={"ID":"e8f14596-799c-4b87-87f4-f5589685876e","Type":"ContainerStarted","Data":"4ff1070eff7b66ddba00e9f36e515296b01c8161d3e63f35a7622ec765ab8ed3"} Oct 11 
04:12:17 crc kubenswrapper[4703]: I1011 04:12:17.596360 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:17 crc kubenswrapper[4703]: I1011 04:12:17.701717 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp7gv\" (UniqueName: \"kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv\") pod \"e8f14596-799c-4b87-87f4-f5589685876e\" (UID: \"e8f14596-799c-4b87-87f4-f5589685876e\") " Oct 11 04:12:17 crc kubenswrapper[4703]: I1011 04:12:17.708690 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv" (OuterVolumeSpecName: "kube-api-access-zp7gv") pod "e8f14596-799c-4b87-87f4-f5589685876e" (UID: "e8f14596-799c-4b87-87f4-f5589685876e"). InnerVolumeSpecName "kube-api-access-zp7gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:17 crc kubenswrapper[4703]: I1011 04:12:17.804255 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp7gv\" (UniqueName: \"kubernetes.io/projected/e8f14596-799c-4b87-87f4-f5589685876e-kube-api-access-zp7gv\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.264248 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" event={"ID":"e8f14596-799c-4b87-87f4-f5589685876e","Type":"ContainerDied","Data":"4ff1070eff7b66ddba00e9f36e515296b01c8161d3e63f35a7622ec765ab8ed3"} Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.264286 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ff1070eff7b66ddba00e9f36e515296b01c8161d3e63f35a7622ec765ab8ed3" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.264657 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2e4d-account-delete-nbtzv" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.758703 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819229 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819353 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819402 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819554 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819636 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 
11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819689 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819765 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819882 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqfwl\" (UniqueName: \"kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.819943 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f0f64df5-124e-4345-b7d3-388ecb90eafd\" (UID: \"f0f64df5-124e-4345-b7d3-388ecb90eafd\") " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.820273 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.820366 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs" (OuterVolumeSpecName: "logs") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.820786 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.820810 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f64df5-124e-4345-b7d3-388ecb90eafd-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.826951 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts" (OuterVolumeSpecName: "scripts") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.826999 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl" (OuterVolumeSpecName: "kube-api-access-qqfwl") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "kube-api-access-qqfwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.836646 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.846541 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.858405 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.859253 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data" (OuterVolumeSpecName: "config-data") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.872720 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f0f64df5-124e-4345-b7d3-388ecb90eafd" (UID: "f0f64df5-124e-4345-b7d3-388ecb90eafd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922289 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922316 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922326 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922334 4703 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922344 4703 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f64df5-124e-4345-b7d3-388ecb90eafd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922353 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqfwl\" (UniqueName: 
\"kubernetes.io/projected/f0f64df5-124e-4345-b7d3-388ecb90eafd-kube-api-access-qqfwl\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.922383 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 11 04:12:18 crc kubenswrapper[4703]: I1011 04:12:18.937822 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.023682 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.273104 4703 generic.go:334] "Generic (PLEG): container finished" podID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerID="ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8" exitCode=0 Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.273146 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerDied","Data":"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8"} Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.273193 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.273220 4703 scope.go:117] "RemoveContainer" containerID="ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.273202 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f0f64df5-124e-4345-b7d3-388ecb90eafd","Type":"ContainerDied","Data":"6bbb94f23ddea7a003ec6b0de0eff302b5f1435c5550804f455ebb17d98f0753"} Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.298518 4703 scope.go:117] "RemoveContainer" containerID="a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.312980 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.317834 4703 scope.go:117] "RemoveContainer" containerID="ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8" Oct 11 04:12:19 crc kubenswrapper[4703]: E1011 04:12:19.318293 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8\": container with ID starting with ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8 not found: ID does not exist" containerID="ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.318335 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8"} err="failed to get container status \"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8\": rpc error: code = NotFound desc = could not find container 
\"ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8\": container with ID starting with ab8ebd5fc440f97ad5c10a04a5629909ae10ebb901870576ca0df827efdd63e8 not found: ID does not exist" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.318363 4703 scope.go:117] "RemoveContainer" containerID="a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946" Oct 11 04:12:19 crc kubenswrapper[4703]: E1011 04:12:19.318788 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946\": container with ID starting with a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946 not found: ID does not exist" containerID="a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.318828 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946"} err="failed to get container status \"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946\": rpc error: code = NotFound desc = could not find container \"a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946\": container with ID starting with a4a4cec823666c3dd116f2289bac0f5de0ff29334d51de99d6122c3f3f277946 not found: ID does not exist" Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.320754 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 11 04:12:19 crc kubenswrapper[4703]: I1011 04:12:19.549823 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" path="/var/lib/kubelet/pods/f0f64df5-124e-4345-b7d3-388ecb90eafd/volumes" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.053628 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-db-create-j6pwn"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.060661 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-j6pwn"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.068277 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-2e4d-account-create-pgl4w"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.073663 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance2e4d-account-delete-nbtzv"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.079784 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-2e4d-account-create-pgl4w"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.085170 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance2e4d-account-delete-nbtzv"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843288 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-l7tvf"] Oct 11 04:12:20 crc kubenswrapper[4703]: E1011 04:12:20.843718 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f14596-799c-4b87-87f4-f5589685876e" containerName="mariadb-account-delete" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843738 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f14596-799c-4b87-87f4-f5589685876e" containerName="mariadb-account-delete" Oct 11 04:12:20 crc kubenswrapper[4703]: E1011 04:12:20.843762 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-log" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843770 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-log" Oct 11 04:12:20 crc kubenswrapper[4703]: E1011 04:12:20.843786 4703 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-httpd" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843806 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-httpd" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843961 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-log" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843981 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f64df5-124e-4345-b7d3-388ecb90eafd" containerName="glance-httpd" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.843999 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f14596-799c-4b87-87f4-f5589685876e" containerName="mariadb-account-delete" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.844555 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.854149 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-l7tvf"] Oct 11 04:12:20 crc kubenswrapper[4703]: I1011 04:12:20.954019 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xm8v\" (UniqueName: \"kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v\") pod \"glance-db-create-l7tvf\" (UID: \"536e9988-8d28-4fe8-8528-5857c7394d83\") " pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.056033 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xm8v\" (UniqueName: \"kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v\") pod \"glance-db-create-l7tvf\" (UID: \"536e9988-8d28-4fe8-8528-5857c7394d83\") " pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.082273 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xm8v\" (UniqueName: \"kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v\") pod \"glance-db-create-l7tvf\" (UID: \"536e9988-8d28-4fe8-8528-5857c7394d83\") " pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.201824 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.545214 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afca6b0-45f3-460d-864b-b0de72f2e817" path="/var/lib/kubelet/pods/3afca6b0-45f3-460d-864b-b0de72f2e817/volumes" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.546399 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804cac9a-99d0-40cc-9251-627acf580d2a" path="/var/lib/kubelet/pods/804cac9a-99d0-40cc-9251-627acf580d2a/volumes" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.547294 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f14596-799c-4b87-87f4-f5589685876e" path="/var/lib/kubelet/pods/e8f14596-799c-4b87-87f4-f5589685876e/volumes" Oct 11 04:12:21 crc kubenswrapper[4703]: I1011 04:12:21.658819 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-l7tvf"] Oct 11 04:12:21 crc kubenswrapper[4703]: W1011 04:12:21.661885 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536e9988_8d28_4fe8_8528_5857c7394d83.slice/crio-62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873 WatchSource:0}: Error finding container 62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873: Status 404 returned error can't find the container with id 62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873 Oct 11 04:12:22 crc kubenswrapper[4703]: I1011 04:12:22.335691 4703 generic.go:334] "Generic (PLEG): container finished" podID="536e9988-8d28-4fe8-8528-5857c7394d83" containerID="6b40bace9527f1e967f7699094b3b7ce6e347984d02120b19bf545f47e9b59b4" exitCode=0 Oct 11 04:12:22 crc kubenswrapper[4703]: I1011 04:12:22.335817 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l7tvf" 
event={"ID":"536e9988-8d28-4fe8-8528-5857c7394d83","Type":"ContainerDied","Data":"6b40bace9527f1e967f7699094b3b7ce6e347984d02120b19bf545f47e9b59b4"} Oct 11 04:12:22 crc kubenswrapper[4703]: I1011 04:12:22.336026 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l7tvf" event={"ID":"536e9988-8d28-4fe8-8528-5857c7394d83","Type":"ContainerStarted","Data":"62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873"} Oct 11 04:12:23 crc kubenswrapper[4703]: I1011 04:12:23.654047 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:23 crc kubenswrapper[4703]: I1011 04:12:23.797861 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xm8v\" (UniqueName: \"kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v\") pod \"536e9988-8d28-4fe8-8528-5857c7394d83\" (UID: \"536e9988-8d28-4fe8-8528-5857c7394d83\") " Oct 11 04:12:23 crc kubenswrapper[4703]: I1011 04:12:23.802871 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v" (OuterVolumeSpecName: "kube-api-access-6xm8v") pod "536e9988-8d28-4fe8-8528-5857c7394d83" (UID: "536e9988-8d28-4fe8-8528-5857c7394d83"). InnerVolumeSpecName "kube-api-access-6xm8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:23 crc kubenswrapper[4703]: I1011 04:12:23.900073 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xm8v\" (UniqueName: \"kubernetes.io/projected/536e9988-8d28-4fe8-8528-5857c7394d83-kube-api-access-6xm8v\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:24 crc kubenswrapper[4703]: I1011 04:12:24.357510 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l7tvf" event={"ID":"536e9988-8d28-4fe8-8528-5857c7394d83","Type":"ContainerDied","Data":"62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873"} Oct 11 04:12:24 crc kubenswrapper[4703]: I1011 04:12:24.357783 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cf22f07c12113f32440be5b286cf2e3f7bbc8f427ebbe5e23b0d4add82e873" Oct 11 04:12:24 crc kubenswrapper[4703]: I1011 04:12:24.357590 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l7tvf" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.897873 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-a5f8-account-create-9t85t"] Oct 11 04:12:30 crc kubenswrapper[4703]: E1011 04:12:30.898804 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536e9988-8d28-4fe8-8528-5857c7394d83" containerName="mariadb-database-create" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.898819 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="536e9988-8d28-4fe8-8528-5857c7394d83" containerName="mariadb-database-create" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.899001 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="536e9988-8d28-4fe8-8528-5857c7394d83" containerName="mariadb-database-create" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.899535 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.902660 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 11 04:12:30 crc kubenswrapper[4703]: I1011 04:12:30.913145 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a5f8-account-create-9t85t"] Oct 11 04:12:31 crc kubenswrapper[4703]: I1011 04:12:31.028323 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29g2v\" (UniqueName: \"kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v\") pod \"glance-a5f8-account-create-9t85t\" (UID: \"37f46686-d548-480b-871b-208b94805891\") " pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:31 crc kubenswrapper[4703]: I1011 04:12:31.130131 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29g2v\" (UniqueName: \"kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v\") pod \"glance-a5f8-account-create-9t85t\" (UID: \"37f46686-d548-480b-871b-208b94805891\") " pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:31 crc kubenswrapper[4703]: I1011 04:12:31.157422 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29g2v\" (UniqueName: \"kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v\") pod \"glance-a5f8-account-create-9t85t\" (UID: \"37f46686-d548-480b-871b-208b94805891\") " pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:31 crc kubenswrapper[4703]: I1011 04:12:31.259512 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:31 crc kubenswrapper[4703]: I1011 04:12:31.543871 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a5f8-account-create-9t85t"] Oct 11 04:12:31 crc kubenswrapper[4703]: W1011 04:12:31.549559 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f46686_d548_480b_871b_208b94805891.slice/crio-afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee WatchSource:0}: Error finding container afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee: Status 404 returned error can't find the container with id afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee Oct 11 04:12:32 crc kubenswrapper[4703]: I1011 04:12:32.436937 4703 generic.go:334] "Generic (PLEG): container finished" podID="37f46686-d548-480b-871b-208b94805891" containerID="f27f7440b3661cf8aef00464f6cc56d278bc936ea66a51dc12406018eeeedc85" exitCode=0 Oct 11 04:12:32 crc kubenswrapper[4703]: I1011 04:12:32.437001 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" event={"ID":"37f46686-d548-480b-871b-208b94805891","Type":"ContainerDied","Data":"f27f7440b3661cf8aef00464f6cc56d278bc936ea66a51dc12406018eeeedc85"} Oct 11 04:12:32 crc kubenswrapper[4703]: I1011 04:12:32.437039 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" event={"ID":"37f46686-d548-480b-871b-208b94805891","Type":"ContainerStarted","Data":"afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee"} Oct 11 04:12:33 crc kubenswrapper[4703]: I1011 04:12:33.817078 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:33 crc kubenswrapper[4703]: I1011 04:12:33.873148 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29g2v\" (UniqueName: \"kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v\") pod \"37f46686-d548-480b-871b-208b94805891\" (UID: \"37f46686-d548-480b-871b-208b94805891\") " Oct 11 04:12:33 crc kubenswrapper[4703]: I1011 04:12:33.878657 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v" (OuterVolumeSpecName: "kube-api-access-29g2v") pod "37f46686-d548-480b-871b-208b94805891" (UID: "37f46686-d548-480b-871b-208b94805891"). InnerVolumeSpecName "kube-api-access-29g2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:33 crc kubenswrapper[4703]: I1011 04:12:33.975426 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29g2v\" (UniqueName: \"kubernetes.io/projected/37f46686-d548-480b-871b-208b94805891-kube-api-access-29g2v\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:34 crc kubenswrapper[4703]: I1011 04:12:34.460436 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" event={"ID":"37f46686-d548-480b-871b-208b94805891","Type":"ContainerDied","Data":"afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee"} Oct 11 04:12:34 crc kubenswrapper[4703]: I1011 04:12:34.460601 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb73b17341c84b1f7b48d3cd69e0000f422f1da18aa4bbd0555b2404107bfee" Oct 11 04:12:34 crc kubenswrapper[4703]: I1011 04:12:34.460524 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a5f8-account-create-9t85t" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.003961 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-qxhrr"] Oct 11 04:12:36 crc kubenswrapper[4703]: E1011 04:12:36.004815 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f46686-d548-480b-871b-208b94805891" containerName="mariadb-account-create" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.004837 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f46686-d548-480b-871b-208b94805891" containerName="mariadb-account-create" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.005240 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f46686-d548-480b-871b-208b94805891" containerName="mariadb-account-create" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.006221 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.009670 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.010199 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-wcpzx" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.014025 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qxhrr"] Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.108590 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc 
kubenswrapper[4703]: I1011 04:12:36.108646 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.108839 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmqhv\" (UniqueName: \"kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.210088 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.210138 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.210191 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqhv\" (UniqueName: \"kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 
04:12:36.214907 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.216573 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.231192 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqhv\" (UniqueName: \"kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv\") pod \"glance-db-sync-qxhrr\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.330862 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:36 crc kubenswrapper[4703]: I1011 04:12:36.789023 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qxhrr"] Oct 11 04:12:37 crc kubenswrapper[4703]: I1011 04:12:37.484786 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qxhrr" event={"ID":"8738d157-5a11-42ce-afd3-edfadaa58f85","Type":"ContainerStarted","Data":"9565ee4f72394006a67d97be215e7e6497dba9e9289761142ae08d108e8e6f9d"} Oct 11 04:12:37 crc kubenswrapper[4703]: I1011 04:12:37.485134 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qxhrr" event={"ID":"8738d157-5a11-42ce-afd3-edfadaa58f85","Type":"ContainerStarted","Data":"b3d7a1419e570183f2e5ac3cc9d0c0157d70223daa87a0497c95eaef7be8b19c"} Oct 11 04:12:37 crc kubenswrapper[4703]: I1011 04:12:37.519375 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-qxhrr" podStartSLOduration=2.5193559949999997 podStartE2EDuration="2.519355995s" podCreationTimestamp="2025-10-11 04:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:12:37.510721659 +0000 UTC m=+1048.721203601" watchObservedRunningTime="2025-10-11 04:12:37.519355995 +0000 UTC m=+1048.729837937" Oct 11 04:12:40 crc kubenswrapper[4703]: I1011 04:12:40.507408 4703 generic.go:334] "Generic (PLEG): container finished" podID="8738d157-5a11-42ce-afd3-edfadaa58f85" containerID="9565ee4f72394006a67d97be215e7e6497dba9e9289761142ae08d108e8e6f9d" exitCode=0 Oct 11 04:12:40 crc kubenswrapper[4703]: I1011 04:12:40.507538 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qxhrr" 
event={"ID":"8738d157-5a11-42ce-afd3-edfadaa58f85","Type":"ContainerDied","Data":"9565ee4f72394006a67d97be215e7e6497dba9e9289761142ae08d108e8e6f9d"} Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.807909 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.893193 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmqhv\" (UniqueName: \"kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv\") pod \"8738d157-5a11-42ce-afd3-edfadaa58f85\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.893242 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data\") pod \"8738d157-5a11-42ce-afd3-edfadaa58f85\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.893279 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data\") pod \"8738d157-5a11-42ce-afd3-edfadaa58f85\" (UID: \"8738d157-5a11-42ce-afd3-edfadaa58f85\") " Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.899493 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8738d157-5a11-42ce-afd3-edfadaa58f85" (UID: "8738d157-5a11-42ce-afd3-edfadaa58f85"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.899629 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv" (OuterVolumeSpecName: "kube-api-access-dmqhv") pod "8738d157-5a11-42ce-afd3-edfadaa58f85" (UID: "8738d157-5a11-42ce-afd3-edfadaa58f85"). InnerVolumeSpecName "kube-api-access-dmqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.950749 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data" (OuterVolumeSpecName: "config-data") pod "8738d157-5a11-42ce-afd3-edfadaa58f85" (UID: "8738d157-5a11-42ce-afd3-edfadaa58f85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.995266 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmqhv\" (UniqueName: \"kubernetes.io/projected/8738d157-5a11-42ce-afd3-edfadaa58f85-kube-api-access-dmqhv\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.995309 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:41 crc kubenswrapper[4703]: I1011 04:12:41.995322 4703 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8738d157-5a11-42ce-afd3-edfadaa58f85-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:42 crc kubenswrapper[4703]: I1011 04:12:42.526019 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qxhrr" 
event={"ID":"8738d157-5a11-42ce-afd3-edfadaa58f85","Type":"ContainerDied","Data":"b3d7a1419e570183f2e5ac3cc9d0c0157d70223daa87a0497c95eaef7be8b19c"} Oct 11 04:12:42 crc kubenswrapper[4703]: I1011 04:12:42.526079 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3d7a1419e570183f2e5ac3cc9d0c0157d70223daa87a0497c95eaef7be8b19c" Oct 11 04:12:42 crc kubenswrapper[4703]: I1011 04:12:42.526102 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qxhrr" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.732139 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 11 04:12:43 crc kubenswrapper[4703]: E1011 04:12:43.732786 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8738d157-5a11-42ce-afd3-edfadaa58f85" containerName="glance-db-sync" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.732812 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8738d157-5a11-42ce-afd3-edfadaa58f85" containerName="glance-db-sync" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.733056 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8738d157-5a11-42ce-afd3-edfadaa58f85" containerName="glance-db-sync" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.734275 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.735790 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.736424 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.737627 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-wcpzx" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.746340 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.822782 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-dev\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.822912 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.822939 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 
11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.822967 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.822995 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-sys\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823012 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823192 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823266 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823339 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823422 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823496 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823533 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823588 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pms2b\" (UniqueName: \"kubernetes.io/projected/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-kube-api-access-pms2b\") pod 
\"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.823624 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-logs\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.824619 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.825888 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.827562 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.848807 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924720 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pms2b\" (UniqueName: \"kubernetes.io/projected/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-kube-api-access-pms2b\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924777 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules\") pod \"glance-default-internal-api-0\" (UID: 
\"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924804 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-logs\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924843 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-dev\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924869 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924900 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924927 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924947 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924969 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.924990 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925014 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925038 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-sys\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925091 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925114 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925134 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925163 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925190 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925221 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925247 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925274 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925298 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xclg\" (UniqueName: \"kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg\") pod \"glance-default-internal-api-0\" (UID: 
\"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925328 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925351 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925375 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925398 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925426 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.925445 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926139 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926175 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926227 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926256 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-dev\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926298 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-logs\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926387 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926485 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926502 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-sys\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926704 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.926785 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.930441 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-scripts\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.935897 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-config-data\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 04:12:43.944139 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:43 crc kubenswrapper[4703]: I1011 
04:12:43.953287 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pms2b\" (UniqueName: \"kubernetes.io/projected/360f7d83-4ffd-4ee4-841d-76f2c50f7e7a-kube-api-access-pms2b\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.001409 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026543 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026573 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026595 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xclg\" (UniqueName: \"kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026621 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026647 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026663 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026685 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026712 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026735 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026752 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026791 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026814 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.026829 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.027272 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.027355 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.027651 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.028177 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.028313 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.028990 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.029034 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.029072 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.029110 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.029554 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.029615 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.030760 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.034132 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.045229 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.046936 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xclg\" (UniqueName: \"kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.047794 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.051479 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.141929 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.275433 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.369991 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:44 crc kubenswrapper[4703]: W1011 04:12:44.377651 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a0634c_7268_4bc5_8c34_02fc4fdc9654.slice/crio-8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3 WatchSource:0}: Error finding container 8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3: Status 404 returned error can't find the container with id 8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3 Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.545029 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerStarted","Data":"471bb93793c489b912e0f8336fab4cd410e4fec5accf2264987b6cb91a61571c"} Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.545337 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerStarted","Data":"8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3"} Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.546772 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a","Type":"ContainerStarted","Data":"87ff9087195a64461ac3d498272e3e87c7e32a8fad39a66e3db5b84ae0f81799"} Oct 11 04:12:44 crc kubenswrapper[4703]: I1011 04:12:44.546818 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a","Type":"ContainerStarted","Data":"63453325c18ca1b3395579d6bf74bed6ec1c4647fcb80e72b0ad923daf65c71c"} Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.079438 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.557850 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerStarted","Data":"56d278850949ca06162a94b6a35161a79a372c13cc2fd9cf5b0da3e5d577aa43"} Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.558180 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerStarted","Data":"98d900138720af28fa185d7aeb07550642a81d18f2b4be1fe8c06eb44916fff2"} Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 
04:12:45.559957 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a","Type":"ContainerStarted","Data":"b8a971a2b1f461f77ec5dd9a91fe02f33192b4ad4bbd6cb4c34d8581f4c4c3bf"} Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.560199 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"360f7d83-4ffd-4ee4-841d-76f2c50f7e7a","Type":"ContainerStarted","Data":"9aaab7928fb92b61d22843c4be8e83289ed785c5da5d33e7e929f3d7228e6904"} Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.585930 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.5859106069999998 podStartE2EDuration="3.585910607s" podCreationTimestamp="2025-10-11 04:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:12:45.580826434 +0000 UTC m=+1056.791308356" watchObservedRunningTime="2025-10-11 04:12:45.585910607 +0000 UTC m=+1056.796392529" Oct 11 04:12:45 crc kubenswrapper[4703]: I1011 04:12:45.611518 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.611490894 podStartE2EDuration="2.611490894s" podCreationTimestamp="2025-10-11 04:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:12:45.605257302 +0000 UTC m=+1056.815739234" watchObservedRunningTime="2025-10-11 04:12:45.611490894 +0000 UTC m=+1056.821972816" Oct 11 04:12:46 crc kubenswrapper[4703]: I1011 04:12:46.567392 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" 
podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-log" containerID="cri-o://471bb93793c489b912e0f8336fab4cd410e4fec5accf2264987b6cb91a61571c" gracePeriod=30 Oct 11 04:12:46 crc kubenswrapper[4703]: I1011 04:12:46.567453 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-httpd" containerID="cri-o://98d900138720af28fa185d7aeb07550642a81d18f2b4be1fe8c06eb44916fff2" gracePeriod=30 Oct 11 04:12:46 crc kubenswrapper[4703]: I1011 04:12:46.567519 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-api" containerID="cri-o://56d278850949ca06162a94b6a35161a79a372c13cc2fd9cf5b0da3e5d577aa43" gracePeriod=30 Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.598998 4703 generic.go:334] "Generic (PLEG): container finished" podID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerID="56d278850949ca06162a94b6a35161a79a372c13cc2fd9cf5b0da3e5d577aa43" exitCode=143 Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599398 4703 generic.go:334] "Generic (PLEG): container finished" podID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerID="98d900138720af28fa185d7aeb07550642a81d18f2b4be1fe8c06eb44916fff2" exitCode=0 Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599414 4703 generic.go:334] "Generic (PLEG): container finished" podID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerID="471bb93793c489b912e0f8336fab4cd410e4fec5accf2264987b6cb91a61571c" exitCode=143 Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599099 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerDied","Data":"56d278850949ca06162a94b6a35161a79a372c13cc2fd9cf5b0da3e5d577aa43"} Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599497 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerDied","Data":"98d900138720af28fa185d7aeb07550642a81d18f2b4be1fe8c06eb44916fff2"} Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599539 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerDied","Data":"471bb93793c489b912e0f8336fab4cd410e4fec5accf2264987b6cb91a61571c"} Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599559 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"50a0634c-7268-4bc5-8c34-02fc4fdc9654","Type":"ContainerDied","Data":"8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3"} Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.599575 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b76f02a75ed7fc373fe8c550c0d8be1c554957dcdb7dc24e9c261345c4ed1b3" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.638188 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.728222 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.728587 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.728741 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.728812 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs" (OuterVolumeSpecName: "logs") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.728929 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xclg\" (UniqueName: \"kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729021 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729129 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729227 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729313 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729480 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729607 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729704 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729789 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729902 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730023 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts\") pod \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\" (UID: \"50a0634c-7268-4bc5-8c34-02fc4fdc9654\") " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729632 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729794 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run" (OuterVolumeSpecName: "run") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729837 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.729871 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730089 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev" (OuterVolumeSpecName: "dev") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730141 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730297 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys" (OuterVolumeSpecName: "sys") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730876 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731006 4703 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-dev\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731109 4703 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731258 4703 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-sys\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731356 4703 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-logs\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731484 4703 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731606 4703 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.731691 4703 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50a0634c-7268-4bc5-8c34-02fc4fdc9654-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.730933 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.737212 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.737828 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts" (OuterVolumeSpecName: "scripts") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.738618 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.751173 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg" (OuterVolumeSpecName: "kube-api-access-6xclg") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "kube-api-access-6xclg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.833270 4703 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50a0634c-7268-4bc5-8c34-02fc4fdc9654-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.833321 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.833370 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.833394 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.833426 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xclg\" (UniqueName: \"kubernetes.io/projected/50a0634c-7268-4bc5-8c34-02fc4fdc9654-kube-api-access-6xclg\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.844358 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data" (OuterVolumeSpecName: "config-data") pod "50a0634c-7268-4bc5-8c34-02fc4fdc9654" (UID: "50a0634c-7268-4bc5-8c34-02fc4fdc9654"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.853156 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.864680 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.935322 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.935369 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:47 crc kubenswrapper[4703]: I1011 04:12:47.935387 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a0634c-7268-4bc5-8c34-02fc4fdc9654-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.607872 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.642147 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.647028 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.686775 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:48 crc kubenswrapper[4703]: E1011 04:12:48.687560 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-log" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.687708 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-log" Oct 11 04:12:48 crc kubenswrapper[4703]: E1011 04:12:48.687840 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-api" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.687953 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-api" Oct 11 04:12:48 crc kubenswrapper[4703]: E1011 04:12:48.688067 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-httpd" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.688204 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-httpd" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.688796 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-api" Oct 11 04:12:48 crc 
kubenswrapper[4703]: I1011 04:12:48.688926 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-log" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.689048 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" containerName="glance-httpd" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.690825 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.695512 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.716372 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749388 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749497 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749529 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749596 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749620 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-run\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749732 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749756 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749780 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6w8r\" (UniqueName: 
\"kubernetes.io/projected/e0cfe194-3f37-44ca-b774-d8c583d7b2df-kube-api-access-f6w8r\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749835 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749866 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749908 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.749969 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.750031 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.750054 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851523 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851565 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851602 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851640 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851655 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851675 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851700 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851717 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851750 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851767 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-run\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851776 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851822 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851856 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851877 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-iscsi\") 
pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851949 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852183 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852240 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852276 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852372 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-run\") pod \"glance-default-internal-api-0\" (UID: 
\"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.851796 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852420 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6w8r\" (UniqueName: \"kubernetes.io/projected/e0cfe194-3f37-44ca-b774-d8c583d7b2df-kube-api-access-f6w8r\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.852487 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.853069 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0cfe194-3f37-44ca-b774-d8c583d7b2df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") 
" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.853111 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0cfe194-3f37-44ca-b774-d8c583d7b2df-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.857700 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.862106 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cfe194-3f37-44ca-b774-d8c583d7b2df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.876965 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.892811 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:48 crc kubenswrapper[4703]: I1011 04:12:48.892956 
4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6w8r\" (UniqueName: \"kubernetes.io/projected/e0cfe194-3f37-44ca-b774-d8c583d7b2df-kube-api-access-f6w8r\") pod \"glance-default-internal-api-0\" (UID: \"e0cfe194-3f37-44ca-b774-d8c583d7b2df\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:49 crc kubenswrapper[4703]: I1011 04:12:49.022633 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:49 crc kubenswrapper[4703]: I1011 04:12:49.499910 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 11 04:12:49 crc kubenswrapper[4703]: W1011 04:12:49.509146 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0cfe194_3f37_44ca_b774_d8c583d7b2df.slice/crio-6a4ece76c1e6fb5135c5faf6ceacfa34eb3133e9102e9c00a73bb768a7eaf0ce WatchSource:0}: Error finding container 6a4ece76c1e6fb5135c5faf6ceacfa34eb3133e9102e9c00a73bb768a7eaf0ce: Status 404 returned error can't find the container with id 6a4ece76c1e6fb5135c5faf6ceacfa34eb3133e9102e9c00a73bb768a7eaf0ce Oct 11 04:12:49 crc kubenswrapper[4703]: I1011 04:12:49.548409 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a0634c-7268-4bc5-8c34-02fc4fdc9654" path="/var/lib/kubelet/pods/50a0634c-7268-4bc5-8c34-02fc4fdc9654/volumes" Oct 11 04:12:49 crc kubenswrapper[4703]: I1011 04:12:49.620020 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0cfe194-3f37-44ca-b774-d8c583d7b2df","Type":"ContainerStarted","Data":"6a4ece76c1e6fb5135c5faf6ceacfa34eb3133e9102e9c00a73bb768a7eaf0ce"} Oct 11 04:12:50 crc kubenswrapper[4703]: I1011 04:12:50.635634 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"e0cfe194-3f37-44ca-b774-d8c583d7b2df","Type":"ContainerStarted","Data":"9441ba7208e7dc5ebd3013ff034ed839648735a16087112c6e136cb7f2f6bb0e"} Oct 11 04:12:50 crc kubenswrapper[4703]: I1011 04:12:50.638127 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0cfe194-3f37-44ca-b774-d8c583d7b2df","Type":"ContainerStarted","Data":"cd2036a2f5b1630c24a800af76756dc4eafc9558f5a13d23820cefcd2f29f19e"} Oct 11 04:12:50 crc kubenswrapper[4703]: I1011 04:12:50.638319 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0cfe194-3f37-44ca-b774-d8c583d7b2df","Type":"ContainerStarted","Data":"091cd8a7a44ba265c23fb3b4636843ae241063ef27ca4a23bb7e4e30ab845005"} Oct 11 04:12:50 crc kubenswrapper[4703]: I1011 04:12:50.684930 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.684912574 podStartE2EDuration="2.684912574s" podCreationTimestamp="2025-10-11 04:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:12:50.68327567 +0000 UTC m=+1061.893757612" watchObservedRunningTime="2025-10-11 04:12:50.684912574 +0000 UTC m=+1061.895394506" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.048168 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.048797 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.048816 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc 
kubenswrapper[4703]: I1011 04:12:54.087412 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.089824 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.112167 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.685166 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.685695 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.685743 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.714809 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.715409 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:54 crc kubenswrapper[4703]: I1011 04:12:54.719051 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.023065 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.023876 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.023910 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.073069 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.073178 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.094802 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.744521 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.744575 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.744592 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.760862 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.766254 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 11 04:12:59 crc kubenswrapper[4703]: I1011 04:12:59.766523 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" 
Oct 11 04:13:50 crc kubenswrapper[4703]: I1011 04:13:50.255228 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:13:50 crc kubenswrapper[4703]: I1011 04:13:50.256802 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:14:20 crc kubenswrapper[4703]: I1011 04:14:20.254953 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:14:20 crc kubenswrapper[4703]: I1011 04:14:20.255624 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.254814 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.255402 4703 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.255503 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.256384 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.256513 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9" gracePeriod=600 Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.831774 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9" exitCode=0 Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.831842 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9"} Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.832160 4703 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969"} Oct 11 04:14:50 crc kubenswrapper[4703]: I1011 04:14:50.832197 4703 scope.go:117] "RemoveContainer" containerID="16bd76b9758ba6b1af68c1c3f4a9fecfcc95697ade75d95a9f0798b6a633f439" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.151577 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt"] Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.154299 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.182654 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.186765 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.188198 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt"] Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.256623 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.256737 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjfj5\" (UniqueName: \"kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.256777 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.358193 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.358273 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjfj5\" (UniqueName: \"kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.358321 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.360219 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.370434 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.396790 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjfj5\" (UniqueName: \"kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5\") pod \"collect-profiles-29335935-4wfqt\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.514889 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:00 crc kubenswrapper[4703]: I1011 04:15:00.994121 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt"] Oct 11 04:15:01 crc kubenswrapper[4703]: W1011 04:15:01.004227 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb80cf53_19af_4abe_9c18_27c68af042e5.slice/crio-3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac WatchSource:0}: Error finding container 3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac: Status 404 returned error can't find the container with id 3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac Oct 11 04:15:01 crc kubenswrapper[4703]: I1011 04:15:01.947963 4703 generic.go:334] "Generic (PLEG): container finished" podID="bb80cf53-19af-4abe-9c18-27c68af042e5" containerID="bd6fd969f6e8724fbab028b5817dd0b659e20e9e5d4e5491a5dfb9879956d261" exitCode=0 Oct 11 04:15:01 crc kubenswrapper[4703]: I1011 04:15:01.948072 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" event={"ID":"bb80cf53-19af-4abe-9c18-27c68af042e5","Type":"ContainerDied","Data":"bd6fd969f6e8724fbab028b5817dd0b659e20e9e5d4e5491a5dfb9879956d261"} Oct 11 04:15:01 crc kubenswrapper[4703]: I1011 04:15:01.948827 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" event={"ID":"bb80cf53-19af-4abe-9c18-27c68af042e5","Type":"ContainerStarted","Data":"3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac"} Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.276365 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.311335 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume\") pod \"bb80cf53-19af-4abe-9c18-27c68af042e5\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.311508 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjfj5\" (UniqueName: \"kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5\") pod \"bb80cf53-19af-4abe-9c18-27c68af042e5\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.311528 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume\") pod \"bb80cf53-19af-4abe-9c18-27c68af042e5\" (UID: \"bb80cf53-19af-4abe-9c18-27c68af042e5\") " Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.312893 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb80cf53-19af-4abe-9c18-27c68af042e5" (UID: "bb80cf53-19af-4abe-9c18-27c68af042e5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.317301 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb80cf53-19af-4abe-9c18-27c68af042e5" (UID: "bb80cf53-19af-4abe-9c18-27c68af042e5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.318616 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5" (OuterVolumeSpecName: "kube-api-access-vjfj5") pod "bb80cf53-19af-4abe-9c18-27c68af042e5" (UID: "bb80cf53-19af-4abe-9c18-27c68af042e5"). InnerVolumeSpecName "kube-api-access-vjfj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.412571 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb80cf53-19af-4abe-9c18-27c68af042e5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.412617 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjfj5\" (UniqueName: \"kubernetes.io/projected/bb80cf53-19af-4abe-9c18-27c68af042e5-kube-api-access-vjfj5\") on node \"crc\" DevicePath \"\"" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.412629 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb80cf53-19af-4abe-9c18-27c68af042e5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.972073 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" event={"ID":"bb80cf53-19af-4abe-9c18-27c68af042e5","Type":"ContainerDied","Data":"3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac"} Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.972132 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2f166e11de1eca7c5092f68dc3ea1c124f3a6b84fd9d0363137ec133a71fac" Oct 11 04:15:03 crc kubenswrapper[4703]: I1011 04:15:03.972152 4703 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335935-4wfqt" Oct 11 04:16:10 crc kubenswrapper[4703]: I1011 04:16:10.318781 4703 scope.go:117] "RemoveContainer" containerID="d7357289a1651775c7d6e340068a89380869665d9f1fa747a80d6e20c964521c" Oct 11 04:16:10 crc kubenswrapper[4703]: I1011 04:16:10.345811 4703 scope.go:117] "RemoveContainer" containerID="2d59bece9c709dc22b4670beec1a99c5c70b2d85849c636775fc658cf5f5b39a" Oct 11 04:16:50 crc kubenswrapper[4703]: I1011 04:16:50.255129 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:16:50 crc kubenswrapper[4703]: I1011 04:16:50.255687 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:17:10 crc kubenswrapper[4703]: I1011 04:17:10.450957 4703 scope.go:117] "RemoveContainer" containerID="4d62f6315776b366f8e29151b144f33260ed637d95896f626009e82d09aae0dd" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.660360 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcbvl"] Oct 11 04:17:15 crc kubenswrapper[4703]: E1011 04:17:15.661394 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb80cf53-19af-4abe-9c18-27c68af042e5" containerName="collect-profiles" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.661414 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb80cf53-19af-4abe-9c18-27c68af042e5" containerName="collect-profiles" Oct 11 04:17:15 crc 
kubenswrapper[4703]: I1011 04:17:15.661682 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb80cf53-19af-4abe-9c18-27c68af042e5" containerName="collect-profiles" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.663279 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.683832 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcbvl"] Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.835715 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.835769 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsnc\" (UniqueName: \"kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.836187 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.937711 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.937774 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsnc\" (UniqueName: \"kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.937899 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.938257 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.938352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.965492 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsnc\" (UniqueName: 
\"kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc\") pod \"community-operators-vcbvl\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") " pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:15 crc kubenswrapper[4703]: I1011 04:17:15.992568 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:16 crc kubenswrapper[4703]: I1011 04:17:16.533189 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcbvl"] Oct 11 04:17:16 crc kubenswrapper[4703]: W1011 04:17:16.536389 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03d2cd5_2ca8_45e3_b5fd_a7b91f8ece5c.slice/crio-87a9723d38cf6fec0a774ca6f0e96c76579e6e20a81f80dc63c9578cf28bd183 WatchSource:0}: Error finding container 87a9723d38cf6fec0a774ca6f0e96c76579e6e20a81f80dc63c9578cf28bd183: Status 404 returned error can't find the container with id 87a9723d38cf6fec0a774ca6f0e96c76579e6e20a81f80dc63c9578cf28bd183 Oct 11 04:17:17 crc kubenswrapper[4703]: I1011 04:17:17.264968 4703 generic.go:334] "Generic (PLEG): container finished" podID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerID="cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8" exitCode=0 Oct 11 04:17:17 crc kubenswrapper[4703]: I1011 04:17:17.265025 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerDied","Data":"cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8"} Oct 11 04:17:17 crc kubenswrapper[4703]: I1011 04:17:17.265079 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" 
event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerStarted","Data":"87a9723d38cf6fec0a774ca6f0e96c76579e6e20a81f80dc63c9578cf28bd183"} Oct 11 04:17:17 crc kubenswrapper[4703]: I1011 04:17:17.267054 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 04:17:18 crc kubenswrapper[4703]: I1011 04:17:18.275369 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerStarted","Data":"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"} Oct 11 04:17:19 crc kubenswrapper[4703]: I1011 04:17:19.287035 4703 generic.go:334] "Generic (PLEG): container finished" podID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerID="bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81" exitCode=0 Oct 11 04:17:19 crc kubenswrapper[4703]: I1011 04:17:19.287106 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerDied","Data":"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"} Oct 11 04:17:20 crc kubenswrapper[4703]: I1011 04:17:20.255170 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:17:20 crc kubenswrapper[4703]: I1011 04:17:20.255594 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:17:20 crc 
kubenswrapper[4703]: I1011 04:17:20.296533 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerStarted","Data":"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"} Oct 11 04:17:20 crc kubenswrapper[4703]: I1011 04:17:20.325857 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcbvl" podStartSLOduration=2.918959664 podStartE2EDuration="5.325833161s" podCreationTimestamp="2025-10-11 04:17:15 +0000 UTC" firstStartedPulling="2025-10-11 04:17:17.266781948 +0000 UTC m=+1328.477263870" lastFinishedPulling="2025-10-11 04:17:19.673655435 +0000 UTC m=+1330.884137367" observedRunningTime="2025-10-11 04:17:20.317819333 +0000 UTC m=+1331.528301265" watchObservedRunningTime="2025-10-11 04:17:20.325833161 +0000 UTC m=+1331.536315093" Oct 11 04:17:25 crc kubenswrapper[4703]: I1011 04:17:25.993771 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:25 crc kubenswrapper[4703]: I1011 04:17:25.994395 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:26 crc kubenswrapper[4703]: I1011 04:17:26.077386 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:26 crc kubenswrapper[4703]: I1011 04:17:26.450280 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcbvl" Oct 11 04:17:26 crc kubenswrapper[4703]: I1011 04:17:26.505689 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcbvl"] Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.386100 4703 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-vcbvl" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="registry-server" containerID="cri-o://c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40" gracePeriod=2
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.823681 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcbvl"
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.975503 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content\") pod \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") "
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.975913 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities\") pod \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") "
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.975945 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbsnc\" (UniqueName: \"kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc\") pod \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\" (UID: \"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c\") "
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.977299 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities" (OuterVolumeSpecName: "utilities") pod "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" (UID: "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 04:17:28 crc kubenswrapper[4703]: I1011 04:17:28.985390 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc" (OuterVolumeSpecName: "kube-api-access-sbsnc") pod "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" (UID: "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c"). InnerVolumeSpecName "kube-api-access-sbsnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.040912 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" (UID: "e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.077148 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.077179 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-utilities\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.077190 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbsnc\" (UniqueName: \"kubernetes.io/projected/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c-kube-api-access-sbsnc\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.398089 4703 generic.go:334] "Generic (PLEG): container finished" podID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerID="c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40" exitCode=0
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.398131 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerDied","Data":"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"}
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.398161 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcbvl" event={"ID":"e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c","Type":"ContainerDied","Data":"87a9723d38cf6fec0a774ca6f0e96c76579e6e20a81f80dc63c9578cf28bd183"}
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.398181 4703 scope.go:117] "RemoveContainer" containerID="c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.398300 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcbvl"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.437026 4703 scope.go:117] "RemoveContainer" containerID="bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.446630 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcbvl"]
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.461017 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vcbvl"]
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.463400 4703 scope.go:117] "RemoveContainer" containerID="cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.507268 4703 scope.go:117] "RemoveContainer" containerID="c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"
Oct 11 04:17:29 crc kubenswrapper[4703]: E1011 04:17:29.507624 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40\": container with ID starting with c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40 not found: ID does not exist" containerID="c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.507664 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40"} err="failed to get container status \"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40\": rpc error: code = NotFound desc = could not find container \"c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40\": container with ID starting with c16ece6718a13a6de59ee6f993e2aca02c8025c0f48a07d972ba50beeb702b40 not found: ID does not exist"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.507689 4703 scope.go:117] "RemoveContainer" containerID="bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"
Oct 11 04:17:29 crc kubenswrapper[4703]: E1011 04:17:29.508152 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81\": container with ID starting with bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81 not found: ID does not exist" containerID="bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.508178 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81"} err="failed to get container status \"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81\": rpc error: code = NotFound desc = could not find container \"bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81\": container with ID starting with bbb4ec87829505858d4403d4e24357e97ded8d414cc0b8420f7ac1f37c424c81 not found: ID does not exist"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.508195 4703 scope.go:117] "RemoveContainer" containerID="cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8"
Oct 11 04:17:29 crc kubenswrapper[4703]: E1011 04:17:29.508691 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8\": container with ID starting with cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8 not found: ID does not exist" containerID="cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.508720 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8"} err="failed to get container status \"cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8\": rpc error: code = NotFound desc = could not find container \"cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8\": container with ID starting with cb24165e77d2101bd27b7df3f1a511c3c3407c1c1ae57897f2131e7167b90ae8 not found: ID does not exist"
Oct 11 04:17:29 crc kubenswrapper[4703]: I1011 04:17:29.545504 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" path="/var/lib/kubelet/pods/e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c/volumes"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.753957 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:31 crc kubenswrapper[4703]: E1011 04:17:31.754700 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="extract-content"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.754719 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="extract-content"
Oct 11 04:17:31 crc kubenswrapper[4703]: E1011 04:17:31.754742 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="extract-utilities"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.754755 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="extract-utilities"
Oct 11 04:17:31 crc kubenswrapper[4703]: E1011 04:17:31.754783 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="registry-server"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.754793 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="registry-server"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.755000 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03d2cd5-2ca8-45e3-b5fd-a7b91f8ece5c" containerName="registry-server"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.763091 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.791340 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.931606 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.931860 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:31 crc kubenswrapper[4703]: I1011 04:17:31.931951 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.032643 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.032791 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.032857 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.033427 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.033533 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.057730 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx\") pod \"certified-operators-n7wfl\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") " pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.085999 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:32 crc kubenswrapper[4703]: I1011 04:17:32.537273 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:33 crc kubenswrapper[4703]: I1011 04:17:33.444178 4703 generic.go:334] "Generic (PLEG): container finished" podID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerID="4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988" exitCode=0
Oct 11 04:17:33 crc kubenswrapper[4703]: I1011 04:17:33.444296 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerDied","Data":"4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988"}
Oct 11 04:17:33 crc kubenswrapper[4703]: I1011 04:17:33.444440 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerStarted","Data":"332d3672da9fdd73474ca32b4586c28c56f9b5eb25894a0d7dafe3617a60f084"}
Oct 11 04:17:34 crc kubenswrapper[4703]: I1011 04:17:34.457058 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerStarted","Data":"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"}
Oct 11 04:17:35 crc kubenswrapper[4703]: I1011 04:17:35.471272 4703 generic.go:334] "Generic (PLEG): container finished" podID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerID="57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1" exitCode=0
Oct 11 04:17:35 crc kubenswrapper[4703]: I1011 04:17:35.471360 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerDied","Data":"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"}
Oct 11 04:17:35 crc kubenswrapper[4703]: I1011 04:17:35.471849 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerStarted","Data":"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"}
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.253098 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n7wfl" podStartSLOduration=8.853166383 podStartE2EDuration="10.253072543s" podCreationTimestamp="2025-10-11 04:17:31 +0000 UTC" firstStartedPulling="2025-10-11 04:17:33.446885448 +0000 UTC m=+1344.657367370" lastFinishedPulling="2025-10-11 04:17:34.846791618 +0000 UTC m=+1346.057273530" observedRunningTime="2025-10-11 04:17:35.495563966 +0000 UTC m=+1346.706045898" watchObservedRunningTime="2025-10-11 04:17:41.253072543 +0000 UTC m=+1352.463554505"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.261867 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"]
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.264403 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.300036 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"]
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.383025 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.383332 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhw4\" (UniqueName: \"kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.383436 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.485376 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.485538 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhw4\" (UniqueName: \"kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.485574 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.486208 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.486230 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.507332 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhw4\" (UniqueName: \"kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4\") pod \"redhat-operators-g5vd8\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:41 crc kubenswrapper[4703]: I1011 04:17:41.632277 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.074528 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"]
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.086664 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.087681 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.161807 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.527212 4703 generic.go:334] "Generic (PLEG): container finished" podID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerID="12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a" exitCode=0
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.528220 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerDied","Data":"12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a"}
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.528270 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerStarted","Data":"9f9eabd1907ef5cf9b6a0dcd912c85f395f23b85ea84fb24c99dbfe799001bfd"}
Oct 11 04:17:42 crc kubenswrapper[4703]: I1011 04:17:42.566068 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:43 crc kubenswrapper[4703]: I1011 04:17:43.574172 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerStarted","Data":"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1"}
Oct 11 04:17:44 crc kubenswrapper[4703]: I1011 04:17:44.423949 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:44 crc kubenswrapper[4703]: I1011 04:17:44.563583 4703 generic.go:334] "Generic (PLEG): container finished" podID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerID="cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1" exitCode=0
Oct 11 04:17:44 crc kubenswrapper[4703]: I1011 04:17:44.565576 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerDied","Data":"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1"}
Oct 11 04:17:45 crc kubenswrapper[4703]: I1011 04:17:45.571842 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerStarted","Data":"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456"}
Oct 11 04:17:45 crc kubenswrapper[4703]: I1011 04:17:45.572077 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n7wfl" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="registry-server" containerID="cri-o://a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee" gracePeriod=2
Oct 11 04:17:45 crc kubenswrapper[4703]: I1011 04:17:45.600554 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5vd8" podStartSLOduration=2.147420756 podStartE2EDuration="4.600536336s" podCreationTimestamp="2025-10-11 04:17:41 +0000 UTC" firstStartedPulling="2025-10-11 04:17:42.528676089 +0000 UTC m=+1353.739158001" lastFinishedPulling="2025-10-11 04:17:44.981791639 +0000 UTC m=+1356.192273581" observedRunningTime="2025-10-11 04:17:45.598085552 +0000 UTC m=+1356.808567494" watchObservedRunningTime="2025-10-11 04:17:45.600536336 +0000 UTC m=+1356.811018258"
Oct 11 04:17:45 crc kubenswrapper[4703]: I1011 04:17:45.986792 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.067398 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx\") pod \"25288c8b-aea6-4d3b-bb66-5103389f442a\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") "
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.067446 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content\") pod \"25288c8b-aea6-4d3b-bb66-5103389f442a\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") "
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.067527 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities\") pod \"25288c8b-aea6-4d3b-bb66-5103389f442a\" (UID: \"25288c8b-aea6-4d3b-bb66-5103389f442a\") "
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.068546 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities" (OuterVolumeSpecName: "utilities") pod "25288c8b-aea6-4d3b-bb66-5103389f442a" (UID: "25288c8b-aea6-4d3b-bb66-5103389f442a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.079746 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx" (OuterVolumeSpecName: "kube-api-access-grmsx") pod "25288c8b-aea6-4d3b-bb66-5103389f442a" (UID: "25288c8b-aea6-4d3b-bb66-5103389f442a"). InnerVolumeSpecName "kube-api-access-grmsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.109754 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25288c8b-aea6-4d3b-bb66-5103389f442a" (UID: "25288c8b-aea6-4d3b-bb66-5103389f442a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.169862 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/25288c8b-aea6-4d3b-bb66-5103389f442a-kube-api-access-grmsx\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.169898 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.169907 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25288c8b-aea6-4d3b-bb66-5103389f442a-utilities\") on node \"crc\" DevicePath \"\""
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.583379 4703 generic.go:334] "Generic (PLEG): container finished" podID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerID="a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee" exitCode=0
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.583433 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerDied","Data":"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"}
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.583517 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7wfl" event={"ID":"25288c8b-aea6-4d3b-bb66-5103389f442a","Type":"ContainerDied","Data":"332d3672da9fdd73474ca32b4586c28c56f9b5eb25894a0d7dafe3617a60f084"}
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.583457 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7wfl"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.583552 4703 scope.go:117] "RemoveContainer" containerID="a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.603416 4703 scope.go:117] "RemoveContainer" containerID="57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.619834 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.627674 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n7wfl"]
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.646399 4703 scope.go:117] "RemoveContainer" containerID="4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.663097 4703 scope.go:117] "RemoveContainer" containerID="a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"
Oct 11 04:17:46 crc kubenswrapper[4703]: E1011 04:17:46.663535 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee\": container with ID starting with a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee not found: ID does not exist" containerID="a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.663584 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee"} err="failed to get container status \"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee\": rpc error: code = NotFound desc = could not find container \"a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee\": container with ID starting with a429889a8376563f0f73063a174c8f3db3e11590ec0f9da261c74c87bc9f41ee not found: ID does not exist"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.663644 4703 scope.go:117] "RemoveContainer" containerID="57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"
Oct 11 04:17:46 crc kubenswrapper[4703]: E1011 04:17:46.663978 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1\": container with ID starting with 57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1 not found: ID does not exist" containerID="57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.664002 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1"} err="failed to get container status \"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1\": rpc error: code = NotFound desc = could not find container \"57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1\": container with ID starting with 57a8876013aed102b27399915db8d8cf3872193001f69a43aa4c0c4078d148a1 not found: ID does not exist"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.664014 4703 scope.go:117] "RemoveContainer" containerID="4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988"
Oct 11 04:17:46 crc kubenswrapper[4703]: E1011 04:17:46.664505 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988\": container with ID starting with 4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988 not found: ID does not exist" containerID="4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988"
Oct 11 04:17:46 crc kubenswrapper[4703]: I1011 04:17:46.664555 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988"} err="failed to get container status \"4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988\": rpc error: code = NotFound desc = could not find container \"4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988\": container with ID starting with 4109afcd21d3d4464e274cf21601bb30aabb4468b661f2e4df34406429485988 not found: ID does not exist"
Oct 11 04:17:47 crc kubenswrapper[4703]: I1011 04:17:47.565371 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" path="/var/lib/kubelet/pods/25288c8b-aea6-4d3b-bb66-5103389f442a/volumes"
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.255632 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.255696 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.255749 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5"
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.256482 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.256551 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969" gracePeriod=600
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.658881 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969" exitCode=0
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.658935 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969"}
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.659301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae"}
Oct 11 04:17:50 crc kubenswrapper[4703]: I1011 04:17:50.659331 4703 scope.go:117] "RemoveContainer" containerID="399509657f5ec21d3ece8fb671d017bd6349c4dadb4c055bb8e50807bdb71cf9"
Oct 11 04:17:51 crc kubenswrapper[4703]: I1011 04:17:51.632407 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:51 crc kubenswrapper[4703]: I1011 04:17:51.632984 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:51 crc kubenswrapper[4703]: I1011 04:17:51.715421 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:51 crc kubenswrapper[4703]: I1011 04:17:51.804295 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5vd8"
Oct 11 04:17:52 crc kubenswrapper[4703]: I1011 04:17:52.819510 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"]
Oct 11 04:17:53 crc kubenswrapper[4703]: I1011 04:17:53.690133 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g5vd8" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="registry-server"
containerID="cri-o://da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456" gracePeriod=2 Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.130560 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vd8" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.300136 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content\") pod \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.300287 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfhw4\" (UniqueName: \"kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4\") pod \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.300436 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities\") pod \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\" (UID: \"d84b4e24-da6c-4eeb-8e60-8b9faf42a826\") " Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.301359 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities" (OuterVolumeSpecName: "utilities") pod "d84b4e24-da6c-4eeb-8e60-8b9faf42a826" (UID: "d84b4e24-da6c-4eeb-8e60-8b9faf42a826"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.306914 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4" (OuterVolumeSpecName: "kube-api-access-qfhw4") pod "d84b4e24-da6c-4eeb-8e60-8b9faf42a826" (UID: "d84b4e24-da6c-4eeb-8e60-8b9faf42a826"). InnerVolumeSpecName "kube-api-access-qfhw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.402828 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfhw4\" (UniqueName: \"kubernetes.io/projected/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-kube-api-access-qfhw4\") on node \"crc\" DevicePath \"\"" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.402894 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.701184 4703 generic.go:334] "Generic (PLEG): container finished" podID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerID="da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456" exitCode=0 Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.701231 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerDied","Data":"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456"} Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.701263 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vd8" event={"ID":"d84b4e24-da6c-4eeb-8e60-8b9faf42a826","Type":"ContainerDied","Data":"9f9eabd1907ef5cf9b6a0dcd912c85f395f23b85ea84fb24c99dbfe799001bfd"} Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 
04:17:54.701283 4703 scope.go:117] "RemoveContainer" containerID="da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.701428 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vd8" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.734358 4703 scope.go:117] "RemoveContainer" containerID="cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.765171 4703 scope.go:117] "RemoveContainer" containerID="12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.811829 4703 scope.go:117] "RemoveContainer" containerID="da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456" Oct 11 04:17:54 crc kubenswrapper[4703]: E1011 04:17:54.812839 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456\": container with ID starting with da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456 not found: ID does not exist" containerID="da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.812906 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456"} err="failed to get container status \"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456\": rpc error: code = NotFound desc = could not find container \"da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456\": container with ID starting with da6c48c404911df6892d973fa7b0d2736b67b46560f6fbf99f450d3fa1557456 not found: ID does not exist" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.812948 4703 
scope.go:117] "RemoveContainer" containerID="cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1" Oct 11 04:17:54 crc kubenswrapper[4703]: E1011 04:17:54.813618 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1\": container with ID starting with cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1 not found: ID does not exist" containerID="cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.813683 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1"} err="failed to get container status \"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1\": rpc error: code = NotFound desc = could not find container \"cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1\": container with ID starting with cd6b523477e0b70ab47a3cba177b4f7d7af2a9213593ad83719f147bffb52bc1 not found: ID does not exist" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.813730 4703 scope.go:117] "RemoveContainer" containerID="12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a" Oct 11 04:17:54 crc kubenswrapper[4703]: E1011 04:17:54.814354 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a\": container with ID starting with 12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a not found: ID does not exist" containerID="12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a" Oct 11 04:17:54 crc kubenswrapper[4703]: I1011 04:17:54.814391 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a"} err="failed to get container status \"12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a\": rpc error: code = NotFound desc = could not find container \"12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a\": container with ID starting with 12cf6f3659dda98fbaa7991765aab9c51ff8e3509fefbd9e9c5004bb23d74f7a not found: ID does not exist" Oct 11 04:17:55 crc kubenswrapper[4703]: I1011 04:17:55.782576 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d84b4e24-da6c-4eeb-8e60-8b9faf42a826" (UID: "d84b4e24-da6c-4eeb-8e60-8b9faf42a826"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:17:55 crc kubenswrapper[4703]: I1011 04:17:55.828740 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d84b4e24-da6c-4eeb-8e60-8b9faf42a826-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:17:55 crc kubenswrapper[4703]: I1011 04:17:55.937530 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"] Oct 11 04:17:55 crc kubenswrapper[4703]: I1011 04:17:55.942238 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g5vd8"] Oct 11 04:17:57 crc kubenswrapper[4703]: I1011 04:17:57.552121 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" path="/var/lib/kubelet/pods/d84b4e24-da6c-4eeb-8e60-8b9faf42a826/volumes" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400046 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400450 
4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="extract-utilities" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400493 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="extract-utilities" Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400533 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400546 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400571 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="extract-content" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400583 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="extract-content" Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400641 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="extract-content" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400652 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="extract-content" Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400676 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400686 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: E1011 04:17:59.400697 4703 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="extract-utilities" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400707 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="extract-utilities" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.400982 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="25288c8b-aea6-4d3b-bb66-5103389f442a" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.401002 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84b4e24-da6c-4eeb-8e60-8b9faf42a826" containerName="registry-server" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.402686 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.423807 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.492728 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.493007 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.493155 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gpn\" (UniqueName: \"kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.594192 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6gpn\" (UniqueName: \"kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.594308 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.594337 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.594964 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.594964 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.618584 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6gpn\" (UniqueName: \"kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn\") pod \"redhat-marketplace-fsnl5\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:17:59 crc kubenswrapper[4703]: I1011 04:17:59.770057 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:00 crc kubenswrapper[4703]: I1011 04:18:00.018921 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:18:00 crc kubenswrapper[4703]: W1011 04:18:00.031831 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf049b580_57a0_4a60_8229_0c5501ee95ca.slice/crio-cf067f5426bfea36853d29ee4afb855fb4930e0499d772fb2bb6505ae68b0183 WatchSource:0}: Error finding container cf067f5426bfea36853d29ee4afb855fb4930e0499d772fb2bb6505ae68b0183: Status 404 returned error can't find the container with id cf067f5426bfea36853d29ee4afb855fb4930e0499d772fb2bb6505ae68b0183 Oct 11 04:18:00 crc kubenswrapper[4703]: I1011 04:18:00.775269 4703 generic.go:334] "Generic (PLEG): container finished" podID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerID="6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749" exitCode=0 Oct 11 04:18:00 crc kubenswrapper[4703]: I1011 04:18:00.775362 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerDied","Data":"6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749"} Oct 11 04:18:00 crc kubenswrapper[4703]: I1011 04:18:00.775648 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerStarted","Data":"cf067f5426bfea36853d29ee4afb855fb4930e0499d772fb2bb6505ae68b0183"} Oct 11 04:18:01 crc kubenswrapper[4703]: I1011 04:18:01.788176 4703 generic.go:334] "Generic (PLEG): container finished" podID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerID="31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f" exitCode=0 Oct 11 04:18:01 crc kubenswrapper[4703]: I1011 04:18:01.788301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerDied","Data":"31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f"} Oct 11 04:18:02 crc kubenswrapper[4703]: I1011 04:18:02.799876 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerStarted","Data":"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59"} Oct 11 04:18:02 crc kubenswrapper[4703]: I1011 04:18:02.822119 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fsnl5" podStartSLOduration=2.357743335 podStartE2EDuration="3.822101612s" podCreationTimestamp="2025-10-11 04:17:59 +0000 UTC" firstStartedPulling="2025-10-11 04:18:00.778231738 +0000 UTC m=+1371.988713670" lastFinishedPulling="2025-10-11 04:18:02.242590015 +0000 UTC m=+1373.453071947" observedRunningTime="2025-10-11 04:18:02.816348432 +0000 UTC m=+1374.026830374" 
watchObservedRunningTime="2025-10-11 04:18:02.822101612 +0000 UTC m=+1374.032583544" Oct 11 04:18:05 crc kubenswrapper[4703]: I1011 04:18:05.047199 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xfg4f"] Oct 11 04:18:05 crc kubenswrapper[4703]: I1011 04:18:05.052992 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-xfg4f"] Oct 11 04:18:05 crc kubenswrapper[4703]: I1011 04:18:05.551900 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571abf26-029e-4204-8f2b-63e1bb50bae4" path="/var/lib/kubelet/pods/571abf26-029e-4204-8f2b-63e1bb50bae4/volumes" Oct 11 04:18:09 crc kubenswrapper[4703]: I1011 04:18:09.771216 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:09 crc kubenswrapper[4703]: I1011 04:18:09.771808 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:09 crc kubenswrapper[4703]: I1011 04:18:09.817236 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:09 crc kubenswrapper[4703]: I1011 04:18:09.941983 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:10 crc kubenswrapper[4703]: I1011 04:18:10.065297 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:18:10 crc kubenswrapper[4703]: I1011 04:18:10.556516 4703 scope.go:117] "RemoveContainer" containerID="135c5993db5db105942b9821e4ff461b64967f201f0295570fe178c5ba0c8cd3" Oct 11 04:18:10 crc kubenswrapper[4703]: I1011 04:18:10.602903 4703 scope.go:117] "RemoveContainer" containerID="e3423e355e537e4e22934b87f1e3d77a685f1ba3e6a5130923a582cf723e90a2" Oct 11 04:18:10 crc 
kubenswrapper[4703]: I1011 04:18:10.668839 4703 scope.go:117] "RemoveContainer" containerID="bc6efe063baa783056e245f2a02bbb9c8b21b34b03786d991f0426384ae6b188" Oct 11 04:18:10 crc kubenswrapper[4703]: I1011 04:18:10.695181 4703 scope.go:117] "RemoveContainer" containerID="5d69e32ed0904675d392fd4a15d7d761c77f9792b20402c42e2b00b03ef730df" Oct 11 04:18:10 crc kubenswrapper[4703]: I1011 04:18:10.749560 4703 scope.go:117] "RemoveContainer" containerID="5f5a7cbd82719763cf8993d75aea233c3a0d82be78bdbd3108dcda952e541729" Oct 11 04:18:11 crc kubenswrapper[4703]: I1011 04:18:11.897259 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fsnl5" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="registry-server" containerID="cri-o://1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59" gracePeriod=2 Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.416280 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.522434 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gpn\" (UniqueName: \"kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn\") pod \"f049b580-57a0-4a60-8229-0c5501ee95ca\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.522600 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content\") pod \"f049b580-57a0-4a60-8229-0c5501ee95ca\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.522632 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities\") pod \"f049b580-57a0-4a60-8229-0c5501ee95ca\" (UID: \"f049b580-57a0-4a60-8229-0c5501ee95ca\") " Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.523287 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities" (OuterVolumeSpecName: "utilities") pod "f049b580-57a0-4a60-8229-0c5501ee95ca" (UID: "f049b580-57a0-4a60-8229-0c5501ee95ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.527920 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn" (OuterVolumeSpecName: "kube-api-access-q6gpn") pod "f049b580-57a0-4a60-8229-0c5501ee95ca" (UID: "f049b580-57a0-4a60-8229-0c5501ee95ca"). InnerVolumeSpecName "kube-api-access-q6gpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.549274 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f049b580-57a0-4a60-8229-0c5501ee95ca" (UID: "f049b580-57a0-4a60-8229-0c5501ee95ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.623788 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gpn\" (UniqueName: \"kubernetes.io/projected/f049b580-57a0-4a60-8229-0c5501ee95ca-kube-api-access-q6gpn\") on node \"crc\" DevicePath \"\"" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.623819 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.623828 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f049b580-57a0-4a60-8229-0c5501ee95ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.915533 4703 generic.go:334] "Generic (PLEG): container finished" podID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerID="1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59" exitCode=0 Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.915636 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerDied","Data":"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59"} Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.915711 4703 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsnl5" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.915942 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsnl5" event={"ID":"f049b580-57a0-4a60-8229-0c5501ee95ca","Type":"ContainerDied","Data":"cf067f5426bfea36853d29ee4afb855fb4930e0499d772fb2bb6505ae68b0183"} Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.915974 4703 scope.go:117] "RemoveContainer" containerID="1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.937825 4703 scope.go:117] "RemoveContainer" containerID="31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.967679 4703 scope.go:117] "RemoveContainer" containerID="6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749" Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.980779 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:18:12 crc kubenswrapper[4703]: I1011 04:18:12.992295 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsnl5"] Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.019262 4703 scope.go:117] "RemoveContainer" containerID="1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59" Oct 11 04:18:13 crc kubenswrapper[4703]: E1011 04:18:13.020388 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59\": container with ID starting with 1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59 not found: ID does not exist" containerID="1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.020426 4703 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59"} err="failed to get container status \"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59\": rpc error: code = NotFound desc = could not find container \"1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59\": container with ID starting with 1e21fe7e823b7dddf54c7c8f4979a65747e4f5a3cf6bcd82377021085e943a59 not found: ID does not exist" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.020456 4703 scope.go:117] "RemoveContainer" containerID="31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f" Oct 11 04:18:13 crc kubenswrapper[4703]: E1011 04:18:13.020749 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f\": container with ID starting with 31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f not found: ID does not exist" containerID="31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.020798 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f"} err="failed to get container status \"31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f\": rpc error: code = NotFound desc = could not find container \"31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f\": container with ID starting with 31a905baffba17fe1ffaf21d2b1270d6a60c7b0dea6aa5ec1e0482808a40204f not found: ID does not exist" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.020816 4703 scope.go:117] "RemoveContainer" containerID="6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749" Oct 11 04:18:13 crc kubenswrapper[4703]: E1011 
04:18:13.021201 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749\": container with ID starting with 6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749 not found: ID does not exist" containerID="6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.021266 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749"} err="failed to get container status \"6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749\": rpc error: code = NotFound desc = could not find container \"6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749\": container with ID starting with 6f3b72623c3a898bd9d37e8c4dc810c8ffcc2b8f8f0c534e97001335d2328749 not found: ID does not exist" Oct 11 04:18:13 crc kubenswrapper[4703]: I1011 04:18:13.553857 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" path="/var/lib/kubelet/pods/f049b580-57a0-4a60-8229-0c5501ee95ca/volumes" Oct 11 04:18:15 crc kubenswrapper[4703]: I1011 04:18:15.028507 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-ec7d-account-create-p9vkj"] Oct 11 04:18:15 crc kubenswrapper[4703]: I1011 04:18:15.036009 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-ec7d-account-create-p9vkj"] Oct 11 04:18:15 crc kubenswrapper[4703]: I1011 04:18:15.543717 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d50c0fe-ad27-488e-93e1-e5bb9d6586a6" path="/var/lib/kubelet/pods/6d50c0fe-ad27-488e-93e1-e5bb9d6586a6/volumes" Oct 11 04:18:30 crc kubenswrapper[4703]: I1011 04:18:30.033288 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/keystone-db-sync-bdtz7"] Oct 11 04:18:30 crc kubenswrapper[4703]: I1011 04:18:30.039623 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-bdtz7"] Oct 11 04:18:31 crc kubenswrapper[4703]: I1011 04:18:31.551339 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd76fb71-0332-453e-b6ee-27d3bcef51c5" path="/var/lib/kubelet/pods/dd76fb71-0332-453e-b6ee-27d3bcef51c5/volumes" Oct 11 04:18:36 crc kubenswrapper[4703]: I1011 04:18:36.051180 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-cz7j7"] Oct 11 04:18:36 crc kubenswrapper[4703]: I1011 04:18:36.060659 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-cz7j7"] Oct 11 04:18:37 crc kubenswrapper[4703]: I1011 04:18:37.545648 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b956b3d-6e86-4bd8-9862-f0a421e4ae20" path="/var/lib/kubelet/pods/7b956b3d-6e86-4bd8-9862-f0a421e4ae20/volumes" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.051858 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:19:07 crc kubenswrapper[4703]: E1011 04:19:07.053065 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="extract-content" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.053090 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="extract-content" Oct 11 04:19:07 crc kubenswrapper[4703]: E1011 04:19:07.053171 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="registry-server" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.053184 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" 
containerName="registry-server" Oct 11 04:19:07 crc kubenswrapper[4703]: E1011 04:19:07.053252 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="extract-utilities" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.053266 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="extract-utilities" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.053601 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f049b580-57a0-4a60-8229-0c5501ee95ca" containerName="registry-server" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.054601 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.065363 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.065553 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-lrfdz" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.065553 4703 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.066203 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.075680 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.146534 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgx68\" (UniqueName: \"kubernetes.io/projected/0d51d993-9626-45d7-b558-34c417afa63a-kube-api-access-zgx68\") pod 
\"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.146621 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.146656 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-scripts\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.146874 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.248200 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.248534 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgx68\" (UniqueName: \"kubernetes.io/projected/0d51d993-9626-45d7-b558-34c417afa63a-kube-api-access-zgx68\") pod \"openstackclient\" (UID: 
\"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.248679 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.248784 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-scripts\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.249408 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.250264 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/0d51d993-9626-45d7-b558-34c417afa63a-openstack-scripts\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.254806 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d51d993-9626-45d7-b558-34c417afa63a-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc 
kubenswrapper[4703]: I1011 04:19:07.275045 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgx68\" (UniqueName: \"kubernetes.io/projected/0d51d993-9626-45d7-b558-34c417afa63a-kube-api-access-zgx68\") pod \"openstackclient\" (UID: \"0d51d993-9626-45d7-b558-34c417afa63a\") " pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.382025 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 11 04:19:07 crc kubenswrapper[4703]: I1011 04:19:07.833649 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 11 04:19:08 crc kubenswrapper[4703]: I1011 04:19:08.465737 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"0d51d993-9626-45d7-b558-34c417afa63a","Type":"ContainerStarted","Data":"85a1532f4ead24ac292dd2a878f93fa9c5091b138b6f3d9777214a566f3778ca"} Oct 11 04:19:08 crc kubenswrapper[4703]: I1011 04:19:08.466153 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"0d51d993-9626-45d7-b558-34c417afa63a","Type":"ContainerStarted","Data":"5df0a9303c3ed3115f41228b248d893a9d07bc2866091964530a518216bac005"} Oct 11 04:19:08 crc kubenswrapper[4703]: I1011 04:19:08.489179 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.489153458 podStartE2EDuration="1.489153458s" podCreationTimestamp="2025-10-11 04:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 04:19:08.481438477 +0000 UTC m=+1439.691920449" watchObservedRunningTime="2025-10-11 04:19:08.489153458 +0000 UTC m=+1439.699635420" Oct 11 04:19:10 crc kubenswrapper[4703]: I1011 04:19:10.893595 4703 scope.go:117] "RemoveContainer" 
containerID="764b62471bcf937100a0c27282fc1feabbb399465d03451a1447945ee99f0f2d" Oct 11 04:19:10 crc kubenswrapper[4703]: I1011 04:19:10.924168 4703 scope.go:117] "RemoveContainer" containerID="426fd0f4fbf8ca8a7f47a8c399f356c75886ad2ad9b9600d1db3f651e0c728f4" Oct 11 04:19:10 crc kubenswrapper[4703]: I1011 04:19:10.987224 4703 scope.go:117] "RemoveContainer" containerID="e5dfb31cdbff8a220a0d32ca7d9e05a2755a1728a8137e586eec40f56ee8979c" Oct 11 04:19:11 crc kubenswrapper[4703]: I1011 04:19:11.021937 4703 scope.go:117] "RemoveContainer" containerID="71b5ade3a1c496fb558036f4a60a8d9b76747c5aa899df596f498a978c78ba64" Oct 11 04:19:11 crc kubenswrapper[4703]: I1011 04:19:11.048225 4703 scope.go:117] "RemoveContainer" containerID="56d278850949ca06162a94b6a35161a79a372c13cc2fd9cf5b0da3e5d577aa43" Oct 11 04:19:11 crc kubenswrapper[4703]: I1011 04:19:11.080071 4703 scope.go:117] "RemoveContainer" containerID="98d900138720af28fa185d7aeb07550642a81d18f2b4be1fe8c06eb44916fff2" Oct 11 04:19:11 crc kubenswrapper[4703]: I1011 04:19:11.107716 4703 scope.go:117] "RemoveContainer" containerID="471bb93793c489b912e0f8336fab4cd410e4fec5accf2264987b6cb91a61571c" Oct 11 04:19:50 crc kubenswrapper[4703]: I1011 04:19:50.254574 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:19:50 crc kubenswrapper[4703]: I1011 04:19:50.255166 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:20:20 crc kubenswrapper[4703]: I1011 04:20:20.254295 4703 patch_prober.go:28] interesting 
pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:20:20 crc kubenswrapper[4703]: I1011 04:20:20.254915 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.571656 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v4mnp/must-gather-r579g"] Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.573459 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.579594 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v4mnp"/"openshift-service-ca.crt" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.579647 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v4mnp"/"kube-root-ca.crt" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.580134 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v4mnp"/"default-dockercfg-tq8sr" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.585611 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc 
kubenswrapper[4703]: I1011 04:20:42.585768 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7s8d\" (UniqueName: \"kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.589706 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v4mnp/must-gather-r579g"] Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.686642 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7s8d\" (UniqueName: \"kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.686709 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.687125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.706989 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7s8d\" (UniqueName: 
\"kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d\") pod \"must-gather-r579g\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:42 crc kubenswrapper[4703]: I1011 04:20:42.891572 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:20:43 crc kubenswrapper[4703]: I1011 04:20:43.379960 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v4mnp/must-gather-r579g"] Oct 11 04:20:43 crc kubenswrapper[4703]: I1011 04:20:43.416717 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4mnp/must-gather-r579g" event={"ID":"421dd176-5f25-41b4-a877-20d568295ed3","Type":"ContainerStarted","Data":"37d8a5101ee70e1156feaee5b1e7fd7047f587f776f7634c145181d73810c091"} Oct 11 04:20:47 crc kubenswrapper[4703]: I1011 04:20:47.447646 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4mnp/must-gather-r579g" event={"ID":"421dd176-5f25-41b4-a877-20d568295ed3","Type":"ContainerStarted","Data":"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f"} Oct 11 04:20:48 crc kubenswrapper[4703]: I1011 04:20:48.461171 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4mnp/must-gather-r579g" event={"ID":"421dd176-5f25-41b4-a877-20d568295ed3","Type":"ContainerStarted","Data":"8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8"} Oct 11 04:20:48 crc kubenswrapper[4703]: I1011 04:20:48.479718 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v4mnp/must-gather-r579g" podStartSLOduration=2.833762081 podStartE2EDuration="6.479698281s" podCreationTimestamp="2025-10-11 04:20:42 +0000 UTC" firstStartedPulling="2025-10-11 04:20:43.387828871 +0000 UTC m=+1534.598310793" lastFinishedPulling="2025-10-11 04:20:47.033765061 +0000 UTC 
m=+1538.244246993" observedRunningTime="2025-10-11 04:20:48.473494779 +0000 UTC m=+1539.683976701" watchObservedRunningTime="2025-10-11 04:20:48.479698281 +0000 UTC m=+1539.690180203" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.254783 4703 patch_prober.go:28] interesting pod/machine-config-daemon-6b7d5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.254849 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.254902 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.255634 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae"} pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.255702 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerName="machine-config-daemon" containerID="cri-o://93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" gracePeriod=600 Oct 11 04:20:50 crc 
kubenswrapper[4703]: E1011 04:20:50.388457 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.475211 4703 generic.go:334] "Generic (PLEG): container finished" podID="74f832fc-6791-47d6-a9b3-07d923e053dc" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" exitCode=0 Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.475254 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerDied","Data":"93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae"} Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.475286 4703 scope.go:117] "RemoveContainer" containerID="47760bd5e163e98f5c9c154e6e147294e18b376d4efe7d50aef86addf0f66969" Oct 11 04:20:50 crc kubenswrapper[4703]: I1011 04:20:50.476241 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:20:50 crc kubenswrapper[4703]: E1011 04:20:50.476676 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:05 crc kubenswrapper[4703]: I1011 04:21:05.534370 4703 scope.go:117] 
"RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:21:05 crc kubenswrapper[4703]: E1011 04:21:05.537223 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:18 crc kubenswrapper[4703]: I1011 04:21:18.533255 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:21:18 crc kubenswrapper[4703]: E1011 04:21:18.534231 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:22 crc kubenswrapper[4703]: I1011 04:21:22.926426 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/util/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.088881 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/pull/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.089171 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/util/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.121043 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/pull/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.258634 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/pull/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.284534 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/util/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.313479 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10f57ec60de4df58ed39de93369cc80174e5ad08476bc9cf01944ad890zc2gm_7e7951b7-f9d2-4e34-a4ec-c64753ec9a26/extract/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.420484 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/util/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.575556 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/pull/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.579099 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/pull/0.log" Oct 11 04:21:23 crc 
kubenswrapper[4703]: I1011 04:21:23.579839 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/util/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.770196 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/extract/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.777252 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/pull/0.log" Oct 11 04:21:23 crc kubenswrapper[4703]: I1011 04:21:23.786360 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8cd0131c5217a421e1cdc6f3fe4164e1b274d2f5d714c8e430ff4eabd1pv9hf_6e907691-72dc-445b-a57f-e70d51d3877c/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.033394 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.147732 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.169248 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/pull/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.177210 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/pull/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.348344 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.350597 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/pull/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.376842 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m778n_43dd0973-48f7-42ac-b4a9-fa2373381562/extract/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.498927 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.662180 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/pull/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.662239 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/pull/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.673302 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/util/0.log" Oct 11 04:21:24 crc 
kubenswrapper[4703]: I1011 04:21:24.832481 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/util/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.865959 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/extract/0.log" Oct 11 04:21:24 crc kubenswrapper[4703]: I1011 04:21:24.885382 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7432b27cbf11f6c9ff99e58a4689d446a54abecf892186881fc53991fcxfcw_3b3d5f97-7702-47e8-836c-fd713dbf3070/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.038054 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.185803 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.191719 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.236005 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.350025 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/extract/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.359349 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.413612 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c090bdbd66b297a4e0a96dc60ec40a8cadf15db03c232395ee40d8ae6cl6jss_4fc4bd20-b803-4781-92ce-d7ecf349398d/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.494555 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.689768 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.705041 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.732274 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/pull/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.857586 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/pull/0.log" Oct 11 04:21:25 crc 
kubenswrapper[4703]: I1011 04:21:25.859757 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/util/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.876614 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d41cb49d6d43f96b279e216261e20a64fc2229fe8c9b730010e4cae654z6brz_f072bbf3-5db0-44fa-853c-fb41e667cc6e/extract/0.log" Oct 11 04:21:25 crc kubenswrapper[4703]: I1011 04:21:25.927601 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/util/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.079162 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/util/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.085009 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/pull/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.100479 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/pull/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.246553 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/extract/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.269845 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/util/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.321498 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7f48cf958-x89d2_766465e2-cc0d-40f4-97cb-d79d92dec7eb/kube-rbac-proxy/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.321824 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f2628378dfbe9d43c4c77358844c5bb7d39b0ec6a549d0614459ab45beprvfw_6d96e09e-5277-45d1-83d1-2230caf6a714/pull/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.490708 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7f48cf958-x89d2_766465e2-cc0d-40f4-97cb-d79d92dec7eb/manager/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.504210 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-index-7gwgv_30afd833-f0d2-4249-9a5b-2c318cef5220/registry-server/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.516806 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5845cf79b9-bcr44_9a73b45a-1271-42e4-9600-afee376337be/kube-rbac-proxy/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.633544 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5845cf79b9-bcr44_9a73b45a-1271-42e4-9600-afee376337be/manager/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.670336 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-psk46_e5ee68c0-4760-42ad-8ff7-e1bb6ddd4f90/registry-server/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.714615 4703 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-65968694dd-z4smp_d9faffc6-5f30-4e1b-94e3-49ffb44ca354/kube-rbac-proxy/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.819229 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-65968694dd-z4smp_d9faffc6-5f30-4e1b-94e3-49ffb44ca354/manager/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.856241 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-vmm7z_94b8f614-7bdc-4b98-9a5a-fccb2775b533/registry-server/0.log" Oct 11 04:21:26 crc kubenswrapper[4703]: I1011 04:21:26.928772 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-bc7dd8474-ppwvq_582b8e4d-9fe3-4e10-b9eb-c13c443128a8/kube-rbac-proxy/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.038554 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-bc7dd8474-ppwvq_582b8e4d-9fe3-4e10-b9eb-c13c443128a8/manager/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.104045 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-6r6dw_2768ed32-fa2e-42a9-b5d3-be0483d294a8/registry-server/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.138186 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-86ffc69b7-nwnvb_6d215033-9dda-4e97-8ba5-72dd0ecea5f9/kube-rbac-proxy/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.206289 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-86ffc69b7-nwnvb_6d215033-9dda-4e97-8ba5-72dd0ecea5f9/manager/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.266341 4703 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-t642v_2cd95ee2-d109-4d57-a3b4-f0c841741119/registry-server/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.311668 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-hwk9j_97ee2afe-4504-4602-8708-09f8ccae07dc/operator/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.414801 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-nmdsx_388c9894-d469-4e1f-a0b1-fe16001c9620/registry-server/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.498412 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5985b768f7-2w5hq_6254e401-fe69-44a8-a16a-4423e12136bf/manager/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.515648 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5985b768f7-2w5hq_6254e401-fe69-44a8-a16a-4423e12136bf/kube-rbac-proxy/0.log" Oct 11 04:21:27 crc kubenswrapper[4703]: I1011 04:21:27.639663 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-vkl46_da7dfdee-6374-4766-9600-13adbf52e3ed/registry-server/0.log" Oct 11 04:21:30 crc kubenswrapper[4703]: I1011 04:21:30.533786 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:21:30 crc kubenswrapper[4703]: E1011 04:21:30.534325 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:42 crc kubenswrapper[4703]: I1011 04:21:42.172631 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8wrsq_95f6c15c-5422-4ef6-a4a8-108959afcae6/control-plane-machine-set-operator/0.log" Oct 11 04:21:42 crc kubenswrapper[4703]: I1011 04:21:42.315715 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fn9dd_990fd4dc-4606-469a-9ced-8f434c2df124/kube-rbac-proxy/0.log" Oct 11 04:21:42 crc kubenswrapper[4703]: I1011 04:21:42.357363 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fn9dd_990fd4dc-4606-469a-9ced-8f434c2df124/machine-api-operator/0.log" Oct 11 04:21:44 crc kubenswrapper[4703]: I1011 04:21:44.533961 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:21:44 crc kubenswrapper[4703]: E1011 04:21:44.534203 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:55 crc kubenswrapper[4703]: I1011 04:21:55.534117 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:21:55 crc kubenswrapper[4703]: E1011 04:21:55.534852 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:21:57 crc kubenswrapper[4703]: I1011 04:21:57.772949 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-b6svk_d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c/controller/0.log" Oct 11 04:21:57 crc kubenswrapper[4703]: I1011 04:21:57.780760 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-b6svk_d772ffa5-ebd2-4e95-bf1f-9f3b01719c0c/kube-rbac-proxy/0.log" Oct 11 04:21:57 crc kubenswrapper[4703]: I1011 04:21:57.891439 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-frr-files/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.050661 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-frr-files/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.077822 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-metrics/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.083277 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-reloader/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.130672 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-reloader/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.227313 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-frr-files/0.log" Oct 11 04:21:58 
crc kubenswrapper[4703]: I1011 04:21:58.248638 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-reloader/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.256438 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-metrics/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.344657 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-metrics/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.486247 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-reloader/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.486394 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-metrics/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.493576 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/cp-frr-files/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.528570 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/controller/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.709825 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/kube-rbac-proxy/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.719727 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/kube-rbac-proxy-frr/0.log" Oct 11 04:21:58 crc kubenswrapper[4703]: I1011 04:21:58.756114 4703 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/frr-metrics/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.017911 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/frr/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.060635 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sdbdc_7fabba07-23a6-4cb6-9e64-7b1ff55e3852/frr-k8s-webhook-server/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.068721 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n642b_5b7b0197-921b-4b00-b918-782fad0911ce/reloader/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.202061 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b46999587-p89xp_35cd67ee-b670-40f8-a7d5-7034e560930a/manager/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.250556 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79b78bfd4c-brzbh_40d928a1-8704-4635-a95b-930bac6ac447/webhook-server/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.377886 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n8sml_5ee9d2ec-7231-499f-818f-135260d80201/kube-rbac-proxy/0.log" Oct 11 04:21:59 crc kubenswrapper[4703]: I1011 04:21:59.510616 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n8sml_5ee9d2ec-7231-499f-818f-135260d80201/speaker/0.log" Oct 11 04:22:06 crc kubenswrapper[4703]: I1011 04:22:06.534069 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:22:06 crc kubenswrapper[4703]: E1011 04:22:06.534862 4703 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.474029 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-a5f8-account-create-9t85t_37f46686-d548-480b-871b-208b94805891/mariadb-account-create/0.log" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.633484 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-create-l7tvf_536e9988-8d28-4fe8-8528-5857c7394d83/mariadb-database-create/0.log" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.706374 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-sync-qxhrr_8738d157-5a11-42ce-afd3-edfadaa58f85/glance-db-sync/0.log" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.838326 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_360f7d83-4ffd-4ee4-841d-76f2c50f7e7a/glance-api/0.log" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.878439 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_360f7d83-4ffd-4ee4-841d-76f2c50f7e7a/glance-httpd/0.log" Oct 11 04:22:13 crc kubenswrapper[4703]: I1011 04:22:13.900810 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_360f7d83-4ffd-4ee4-841d-76f2c50f7e7a/glance-log/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.083890 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_e0cfe194-3f37-44ca-b774-d8c583d7b2df/glance-api/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.098483 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_e0cfe194-3f37-44ca-b774-d8c583d7b2df/glance-httpd/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.121730 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_e0cfe194-3f37-44ca-b774-d8c583d7b2df/glance-log/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.415662 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_61bf3124-e977-402f-908c-85675bcd26ed/memcached/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.481381 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-76f545d5bb-kbdgh_8e0390a0-7f33-42a4-9657-3796ac67f9a5/keystone-api/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.537372 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_54089f16-2b27-4774-bacd-faf623efc8a0/mysql-bootstrap/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.685186 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_54089f16-2b27-4774-bacd-faf623efc8a0/mysql-bootstrap/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.727885 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_6973f9a3-172f-4f91-8f2c-4e14b7ee07c2/mysql-bootstrap/0.log" Oct 11 04:22:14 crc kubenswrapper[4703]: I1011 04:22:14.730154 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_54089f16-2b27-4774-bacd-faf623efc8a0/galera/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.006198 4703 log.go:25] 
"Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_6973f9a3-172f-4f91-8f2c-4e14b7ee07c2/galera/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.012224 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_6973f9a3-172f-4f91-8f2c-4e14b7ee07c2/mysql-bootstrap/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.019291 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_64938774-1b02-463f-96e3-451096b692d6/mysql-bootstrap/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.205734 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_64938774-1b02-463f-96e3-451096b692d6/mysql-bootstrap/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.216920 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_64938774-1b02-463f-96e3-451096b692d6/galera/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.259949 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_0d51d993-9626-45d7-b558-34c417afa63a/openstackclient/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.387536 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8/setup-container/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.585687 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8/setup-container/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.628340 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_ae0ef79e-e4aa-49b4-a412-f56b3e90c4a8/rabbitmq/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.644856 4703 log.go:25] 
"Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-6dd8f59749-c7s8q_e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6/proxy-httpd/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.753346 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-6dd8f59749-c7s8q_e0cb8f8e-afe1-4bc0-af34-bd30cf77e7e6/proxy-server/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.847686 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-q8xj5_76b1c0ac-2d3b-4d7b-ab5e-61044e6c92bd/swift-ring-rebalance/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.936289 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/account-auditor/0.log" Oct 11 04:22:15 crc kubenswrapper[4703]: I1011 04:22:15.964587 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/account-reaper/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.007769 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/account-replicator/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.042442 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/account-server/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.152351 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/container-replicator/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.158072 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/container-auditor/0.log" Oct 11 04:22:16 crc 
kubenswrapper[4703]: I1011 04:22:16.224486 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/container-server/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.232913 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/container-updater/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.344561 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/object-expirer/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.349863 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/object-auditor/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.382329 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/object-replicator/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.410396 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/object-server/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.411126 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/object-updater/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.523747 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/rsync/0.log" Oct 11 04:22:16 crc kubenswrapper[4703]: I1011 04:22:16.525325 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_edaa5850-8c67-4739-8aa9-b02eff7e3291/swift-recon-cron/0.log" Oct 11 04:22:19 crc 
kubenswrapper[4703]: I1011 04:22:19.539131 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:22:19 crc kubenswrapper[4703]: E1011 04:22:19.539729 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:22:24 crc kubenswrapper[4703]: I1011 04:22:24.036340 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-l7tvf"] Oct 11 04:22:24 crc kubenswrapper[4703]: I1011 04:22:24.042905 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-l7tvf"] Oct 11 04:22:25 crc kubenswrapper[4703]: I1011 04:22:25.545843 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536e9988-8d28-4fe8-8528-5857c7394d83" path="/var/lib/kubelet/pods/536e9988-8d28-4fe8-8528-5857c7394d83/volumes" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.139280 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/util/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.328591 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/pull/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.335711 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/pull/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.339815 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/util/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.509733 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/pull/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.514894 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/util/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.530217 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2cdnlp_188f1812-b442-4b58-a5f9-4251a18bea8c/extract/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.647982 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-utilities/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.826035 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-utilities/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.827694 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-content/0.log" Oct 11 04:22:29 crc kubenswrapper[4703]: I1011 04:22:29.843331 4703 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-content/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.016208 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-utilities/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.042491 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/extract-content/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.231543 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-utilities/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.464536 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-utilities/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.496756 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-content/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.498009 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pcrkn_9b932657-cddd-4fd6-b45b-074f292386da/registry-server/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.521966 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-content/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.684562 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-utilities/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.707012 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/extract-content/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.963758 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8zb8v_08edae5f-0299-4ef5-98d1-3f1be67bfb35/registry-server/0.log" Oct 11 04:22:30 crc kubenswrapper[4703]: I1011 04:22:30.990784 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dfkzg_c9b7ec35-94a9-42b2-b086-a5810df3acf3/marketplace-operator/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.123774 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-utilities/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.309135 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-content/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.335861 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-content/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.361304 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-utilities/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.506389 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-content/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.509606 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/extract-utilities/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.536131 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:22:31 crc kubenswrapper[4703]: E1011 04:22:31.536996 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.585717 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwlnc_f65a18cc-3c4c-427d-b3cf-82b33c238e47/registry-server/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.715888 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-utilities/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.834242 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-utilities/0.log" Oct 11 04:22:31 crc kubenswrapper[4703]: I1011 04:22:31.875735 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-content/0.log" Oct 11 04:22:31 crc 
kubenswrapper[4703]: I1011 04:22:31.891911 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-content/0.log" Oct 11 04:22:32 crc kubenswrapper[4703]: I1011 04:22:32.020512 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-content/0.log" Oct 11 04:22:32 crc kubenswrapper[4703]: I1011 04:22:32.053533 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/extract-utilities/0.log" Oct 11 04:22:32 crc kubenswrapper[4703]: I1011 04:22:32.442709 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhh4v_f10f15cc-0c90-45a7-8f0e-d258775abff7/registry-server/0.log" Oct 11 04:22:34 crc kubenswrapper[4703]: I1011 04:22:34.033015 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-a5f8-account-create-9t85t"] Oct 11 04:22:34 crc kubenswrapper[4703]: I1011 04:22:34.045237 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-a5f8-account-create-9t85t"] Oct 11 04:22:35 crc kubenswrapper[4703]: I1011 04:22:35.543818 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f46686-d548-480b-871b-208b94805891" path="/var/lib/kubelet/pods/37f46686-d548-480b-871b-208b94805891/volumes" Oct 11 04:22:42 crc kubenswrapper[4703]: I1011 04:22:42.048620 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qxhrr"] Oct 11 04:22:42 crc kubenswrapper[4703]: I1011 04:22:42.054448 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qxhrr"] Oct 11 04:22:43 crc kubenswrapper[4703]: I1011 04:22:43.541951 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8738d157-5a11-42ce-afd3-edfadaa58f85" path="/var/lib/kubelet/pods/8738d157-5a11-42ce-afd3-edfadaa58f85/volumes" Oct 11 04:22:45 crc kubenswrapper[4703]: I1011 04:22:45.533638 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:22:45 crc kubenswrapper[4703]: E1011 04:22:45.534042 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:22:57 crc kubenswrapper[4703]: I1011 04:22:57.535161 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:22:57 crc kubenswrapper[4703]: E1011 04:22:57.536055 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:23:10 crc kubenswrapper[4703]: I1011 04:23:10.534022 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:23:10 crc kubenswrapper[4703]: E1011 04:23:10.534904 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:23:11 crc kubenswrapper[4703]: I1011 04:23:11.299082 4703 scope.go:117] "RemoveContainer" containerID="6b40bace9527f1e967f7699094b3b7ce6e347984d02120b19bf545f47e9b59b4" Oct 11 04:23:11 crc kubenswrapper[4703]: I1011 04:23:11.319847 4703 scope.go:117] "RemoveContainer" containerID="9565ee4f72394006a67d97be215e7e6497dba9e9289761142ae08d108e8e6f9d" Oct 11 04:23:11 crc kubenswrapper[4703]: I1011 04:23:11.387373 4703 scope.go:117] "RemoveContainer" containerID="f27f7440b3661cf8aef00464f6cc56d278bc936ea66a51dc12406018eeeedc85" Oct 11 04:23:21 crc kubenswrapper[4703]: I1011 04:23:21.533489 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:23:21 crc kubenswrapper[4703]: E1011 04:23:21.534758 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:23:34 crc kubenswrapper[4703]: I1011 04:23:34.533856 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:23:34 crc kubenswrapper[4703]: E1011 04:23:34.534674 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:23:36 crc kubenswrapper[4703]: I1011 04:23:36.816977 4703 generic.go:334] "Generic (PLEG): container finished" podID="421dd176-5f25-41b4-a877-20d568295ed3" containerID="7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f" exitCode=0 Oct 11 04:23:36 crc kubenswrapper[4703]: I1011 04:23:36.817056 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4mnp/must-gather-r579g" event={"ID":"421dd176-5f25-41b4-a877-20d568295ed3","Type":"ContainerDied","Data":"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f"} Oct 11 04:23:36 crc kubenswrapper[4703]: I1011 04:23:36.817773 4703 scope.go:117] "RemoveContainer" containerID="7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f" Oct 11 04:23:36 crc kubenswrapper[4703]: I1011 04:23:36.917971 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4mnp_must-gather-r579g_421dd176-5f25-41b4-a877-20d568295ed3/gather/0.log" Oct 11 04:23:43 crc kubenswrapper[4703]: I1011 04:23:43.951791 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v4mnp/must-gather-r579g"] Oct 11 04:23:43 crc kubenswrapper[4703]: I1011 04:23:43.952682 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v4mnp/must-gather-r579g" podUID="421dd176-5f25-41b4-a877-20d568295ed3" containerName="copy" containerID="cri-o://8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8" gracePeriod=2 Oct 11 04:23:43 crc kubenswrapper[4703]: I1011 04:23:43.956031 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v4mnp/must-gather-r579g"] Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.305615 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-v4mnp_must-gather-r579g_421dd176-5f25-41b4-a877-20d568295ed3/copy/0.log" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.306124 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.423760 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7s8d\" (UniqueName: \"kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d\") pod \"421dd176-5f25-41b4-a877-20d568295ed3\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.423886 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output\") pod \"421dd176-5f25-41b4-a877-20d568295ed3\" (UID: \"421dd176-5f25-41b4-a877-20d568295ed3\") " Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.439730 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d" (OuterVolumeSpecName: "kube-api-access-j7s8d") pod "421dd176-5f25-41b4-a877-20d568295ed3" (UID: "421dd176-5f25-41b4-a877-20d568295ed3"). InnerVolumeSpecName "kube-api-access-j7s8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.512969 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "421dd176-5f25-41b4-a877-20d568295ed3" (UID: "421dd176-5f25-41b4-a877-20d568295ed3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.525994 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7s8d\" (UniqueName: \"kubernetes.io/projected/421dd176-5f25-41b4-a877-20d568295ed3-kube-api-access-j7s8d\") on node \"crc\" DevicePath \"\"" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.526048 4703 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/421dd176-5f25-41b4-a877-20d568295ed3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.885401 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4mnp_must-gather-r579g_421dd176-5f25-41b4-a877-20d568295ed3/copy/0.log" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.885865 4703 generic.go:334] "Generic (PLEG): container finished" podID="421dd176-5f25-41b4-a877-20d568295ed3" containerID="8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8" exitCode=143 Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.885926 4703 scope.go:117] "RemoveContainer" containerID="8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.886088 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v4mnp/must-gather-r579g" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.906176 4703 scope.go:117] "RemoveContainer" containerID="7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.956249 4703 scope.go:117] "RemoveContainer" containerID="8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8" Oct 11 04:23:44 crc kubenswrapper[4703]: E1011 04:23:44.957205 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8\": container with ID starting with 8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8 not found: ID does not exist" containerID="8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.957316 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8"} err="failed to get container status \"8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8\": rpc error: code = NotFound desc = could not find container \"8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8\": container with ID starting with 8d4830f507cb3955a8746a1603f27d67c9c03b3959d1e90d24479ecde6e334d8 not found: ID does not exist" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.957405 4703 scope.go:117] "RemoveContainer" containerID="7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f" Oct 11 04:23:44 crc kubenswrapper[4703]: E1011 04:23:44.957793 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f\": container with ID starting with 
7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f not found: ID does not exist" containerID="7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f" Oct 11 04:23:44 crc kubenswrapper[4703]: I1011 04:23:44.957872 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f"} err="failed to get container status \"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f\": rpc error: code = NotFound desc = could not find container \"7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f\": container with ID starting with 7481354e387b3e9594f95b9a1c85de5919f147445507e9be31667f0ca283336f not found: ID does not exist" Oct 11 04:23:45 crc kubenswrapper[4703]: I1011 04:23:45.542748 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421dd176-5f25-41b4-a877-20d568295ed3" path="/var/lib/kubelet/pods/421dd176-5f25-41b4-a877-20d568295ed3/volumes" Oct 11 04:23:46 crc kubenswrapper[4703]: I1011 04:23:46.533373 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:23:46 crc kubenswrapper[4703]: E1011 04:23:46.533781 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:24:00 crc kubenswrapper[4703]: I1011 04:24:00.534076 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:24:00 crc kubenswrapper[4703]: E1011 04:24:00.535190 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:24:12 crc kubenswrapper[4703]: I1011 04:24:12.535153 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:24:12 crc kubenswrapper[4703]: E1011 04:24:12.537434 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:24:25 crc kubenswrapper[4703]: I1011 04:24:25.534009 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:24:25 crc kubenswrapper[4703]: E1011 04:24:25.534895 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:24:36 crc kubenswrapper[4703]: I1011 04:24:36.533610 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:24:36 crc kubenswrapper[4703]: E1011 04:24:36.534448 4703 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:24:50 crc kubenswrapper[4703]: I1011 04:24:50.533812 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:24:50 crc kubenswrapper[4703]: E1011 04:24:50.534550 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:25:05 crc kubenswrapper[4703]: I1011 04:25:05.534528 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:25:05 crc kubenswrapper[4703]: E1011 04:25:05.536139 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:25:19 crc kubenswrapper[4703]: I1011 04:25:19.544454 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:25:19 crc kubenswrapper[4703]: E1011 04:25:19.545454 4703 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:25:31 crc kubenswrapper[4703]: I1011 04:25:31.540352 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:25:31 crc kubenswrapper[4703]: E1011 04:25:31.541565 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:25:43 crc kubenswrapper[4703]: I1011 04:25:43.534260 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:25:43 crc kubenswrapper[4703]: E1011 04:25:43.535515 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6b7d5_openshift-machine-config-operator(74f832fc-6791-47d6-a9b3-07d923e053dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" podUID="74f832fc-6791-47d6-a9b3-07d923e053dc" Oct 11 04:25:55 crc kubenswrapper[4703]: I1011 04:25:55.534045 4703 scope.go:117] "RemoveContainer" containerID="93574af0cf2738114bd5085b74182463f92f2f4ce3cf1ce5560bcf2bff6e2eae" Oct 11 04:25:56 crc kubenswrapper[4703]: I1011 04:25:56.071518 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6b7d5" event={"ID":"74f832fc-6791-47d6-a9b3-07d923e053dc","Type":"ContainerStarted","Data":"f9be0cb3381b13eb1799240ad974d7626e02b44f27eaf9ac92295545e323f30a"}